I second the Roku; they're economically priced, they perform very well, and have minimal ads. My friends and family use them to stream from my Jellyfin server over the internet to their homes.
Well, OpenERP or Openbravo are what I would have recommended ten years ago, but due to their commercialization they aren't really relevant any longer. If I were setting this up for myself, I would probably use Redmine plus a plugin that gives it invoice functionality. I wouldn't call it simple for a first-timer to pull off, but once Redmine is mastered you will find it very extensible and customizable to any particular project's needs.
The effect is so small, and it decays exponentially, getting smaller with each step, so it never really reaches zero; the eventual heat death of the universe will happen long before this effect has time to do its thing.
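To put toy numbers on that intuition: with exponential decay, each step removes a smaller absolute amount than the step before, and the value approaches but never reaches zero. A minimal sketch (the starting mass and decay rate here are made up purely for illustration):

```python
import math

def mass_at(t, m0=1.0, decay_rate=1e-3):
    """Toy exponential decay: m(t) = m0 * exp(-rate * t)."""
    return m0 * math.exp(-decay_rate * t)

# Each step loses less absolute mass than the one before...
losses = [mass_at(t) - mass_at(t + 1) for t in range(5)]
assert all(losses[i] > losses[i + 1] for i in range(len(losses) - 1))

# ...and the mass never actually hits zero, however long we wait.
assert mass_at(100_000) > 0.0
```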
I would like to add to this: I expect the jargon and equations to fly far over the heads of most of the general public, so I will try to summarize the entire process here:
My initial idea was that the horizon of our universe was "leaking" energy, just as a black hole's horizon does. If that were true, then according to E = mc² the universe's mass should decrease over time.
I had earlier made the connection that dark matter and dark energy may be a geometric phenomenon, and wrote a thought experiment outlining the idea. Take two observers and keep them stationary in space, then in some fantastic way shrink them both at the same rate at the same time, without the observers knowing. From the observers' point of view, they appear to be moving away from each other: they stay the same size relative to each other, while the distance between them increases as their mass, and thereby their volume, decreases.
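That thought experiment can be put in toy numbers. Below is a minimal sketch (pure illustration; the coordinate distance and shrink rate are made up): two observers sit at fixed coordinates, both shrink by the same factor each step, and the separation measured in units of their own size grows even though the coordinate distance never changes.

```python
def apparent_distance(coord_distance, size):
    """Distance as the observers measure it: in units of their own size."""
    return coord_distance / size

coord_distance = 10.0   # fixed: neither observer actually moves
size = 1.0
shrink_factor = 0.99    # both observers shrink 1% per step (made-up rate)

history = []
for _ in range(100):
    history.append(apparent_distance(coord_distance, size))
    size *= shrink_factor

# The measured separation grows every step, so each observer sees
# the other receding even though both stayed put.
assert all(a < b for a, b in zip(history, history[1:]))
```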
These two basic ideas together seemed to say that as the universe loses energy from its horizon like a black hole, the atoms and point masses inside all lose a tiny amount of mass, and thereby each point's volume decreases a tiny amount. This causes the distance between the particles to increase ever so slightly. Summed over every particle in the universe, this effect gives us the expansion of distance we call dark energy. The same effect, taken together with relativity and the laws of motion, produces a dark matter signal as well.
Lastly, gravity was already suspected to be quantum in nature, so digging further revealed this interesting, if somewhat novel, view of the universe and everything in it.
I have some of my previous papers hosted on ResearchHub; this link leads to a post with further information about the initial research that led to this paper, and I continue to use ResearchHub for further work.
I hope this additional information has helped you in some way to understand this idea I call special gravity. Cheers!
Lol, I really get that vibe too, but it is very interesting :P
I completely agree with every word, it was the observations alone of dark energy and matter that led me in this direction.
At one time I tried to describe them with an unknown fifth dimension, but later realized that's only an abstraction. Perhaps black holes and universes share this property of evaporation, which, if so, would have interesting consequences.
I have the thought experiment to go along with the paper, if you'd like to see it, at https://madhakker.com/ (just scroll down one post). That was from when I was trying the fifth-dimension angle, but it does a good job of describing a dark-matter-like signal in terms of a changing mass.
Also you’re asking about multi gpu, I have a few other cards stuffed in my backplane. The GeForce GTX 1050 Ti has 4GB of vram, and is comparable to the P40 in performance. I have split a larger 33B model on the two cards. Splitting a large model is of course slower than running on one card alone, but is much faster than cpu (even with 48 threads). However speed when splitting depends on the speed of the pci-e bus, which for me is limited to gen 1 speeds for now. If you have a faster/newer pci-e standard then you’ll see better results than me.
Correct, my backplane doesn't have the airflow of a big server box. Another gotcha is that the P40 uses an 8-pin CPU power plug, not an 8-pin GPU plug.
Edit: 8-pin, not 6-pin.
The P40 doesn’t have active cooling, it really needs forced air flow which I grabbed one of these for
https://www.ebay.com/itm/285241802202
It’s even cheaper now than when I bought mine.
I have a P40 I'd be glad to run a benchmark on, just tell me how. I have Ooba and llama.cpp installed on Ubuntu 22.04. It's a Dell R620 with 2x 12-core 3.5 GHz Xeons (2 threads per core, for 48 threads) and 256GB RAM @ 1833 MHz, and I have a PCIe gen 1 20-slot backplane. The speed of the PCIe bus might impact the loading time of the large models, but seems not to affect the speed of inference.
I went with the P40 for cost per GB of VRAM; speed was less important to me than being able to load the larger models at all. Including the fan and fan coupling, I'm all-in about $250 per card. I'm planning on adding more in the future; I too suffer from too many PCIe slots.
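The cost-per-GB math that sold me, as a quick sanity check (using my ~$250 all-in figure from above):

```python
def cost_per_gb(total_cost_usd, vram_gb):
    """Dollars per GB of VRAM: the metric I optimized for."""
    return total_cost_usd / vram_gb

# All-in ~$250 for a 24 GB P40 (card + fan + fan coupling)
p40 = cost_per_gb(250, 24)
assert round(p40, 2) == 10.42  # roughly ten bucks per GB of VRAM
```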
I don't think the CUDA version will become an issue anytime too soon, but it is coming, to be sure.
My setup is made up of two "servers": one 20-bay SAS/SATA Unraid server and one Dell R620 with a 20-slot PCIe backplane expansion.
[1] Unraid box 49TB Norco-4220
ASUSTeK Computer Inc. P7H55D-M EVO, Rev 1.xx
Intel® Core™ i3 CPU 530, 16GB RAM
Docker:
Virtual machine Debian Bullseye:
[2] Ubuntu 22.04, Dell R620 with OSS-BP20 backplane
2x Intel® Xeon® CPU E5-2697 v2
256GB RAM
8x 256GB SSDs in RAID 0
Public facing services for you to poke around at are:
If the horizon of the universe is like the horizon of a black hole, then the energy loss through Hawking radiation, via the conversion E = mc², simply implies that mass is lost from the universe over time. If we extrapolate this energy/mass loss over time for every mass in the universe, then the distance between the surfaces of each grows as a relative change against the exponentially decreasing mass, directly correlating with the dark phenomena we observe, as a geometric quantum event.
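As a hedged formal sketch of that chain of reasoning (this only restates the idea above: the decay constant λ is left unspecified, each point mass is assumed to keep a fixed density so volume tracks mass, and whether the universe's horizon radiates like a black hole's is the conjecture itself):

```latex
E = mc^2 \quad\Rightarrow\quad \frac{dm}{dt} = \frac{1}{c^2}\,\frac{dE}{dt}
\qquad\text{(horizon radiation drains mass)}

m(t) = m_0\, e^{-\lambda t}
\qquad
r(t) \propto m(t)^{1/3} = r_0\, e^{-\lambda t / 3}
\qquad\text{(fixed density: volume tracks mass)}

\frac{d}{r(t)} = \frac{d}{r_0}\, e^{+\lambda t / 3}
\qquad\text{(fixed separation } d \text{ grows when measured in units of } r\text{)}
```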