Wait a minute, you’re telling me I can give my money to a spellchecker? Insane! Sign me up!
Why would anyone use this? What is the value-add?
It’s a decentralized, open-source platform, akin to what Lemmy is to Reddit. It allows sharing hardware resources among users and offers a more cost-effective solution than commercial offerings. And it’s a sophisticated text prediction engine, not just a spell checker.
^ Reply generated from our model as an example. Judging by your comment history you are very critical of the capabilities of large language models, but that’s totally fine. We are open to criticism and admit LLMs don’t possess extremely profound capabilities yet. However, we are trying to prepare for a future where these models might attain an order of magnitude more parameters and the cost of compute inevitably comes down.
How are you prepared for the fact that these LLMs can be run locally? All anyone needs is a GPU. The same GPU they could use for crypto, but could instead use for an LLM, making your coin useless?
We don’t use proof of work. We plan on using proof of stake, so there is no need to use the GPU for mining crypto; this is a proven model used by Ethereum. And yes, anyone could run models locally. However, the best models require thousands of dollars’ worth of hardware to run. It makes more sense to share that hardware with whoever needs it at the time than to buy thousands of dollars in equipment just to run a model you might use only a few times per day. Additionally, if you need to run multiple queries at once on your own machine, you are stuck waiting for them to be completed sequentially. With this network you could submit them and have them completed in parallel. Not to mention that training models requires far more than a consumer GPU and would need something like a DGX A100, a $200,000 system, to finish in any reasonable amount of time. Here, we could train on hundreds of consumer GPUs concurrently.
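The sequential-vs-parallel point above is easy to sketch. This is a minimal illustration, not the project’s actual client: `run_inference` is a hypothetical stand-in for a network call to a GPU node, simulated here with a fixed sleep, and the thread pool plays the role of fanning queries out to multiple nodes at once.

```python
import concurrent.futures
import time

def run_inference(prompt: str) -> str:
    # Placeholder for a network request to a remote GPU node;
    # the sleep simulates a fixed per-query latency.
    time.sleep(0.1)
    return f"response to: {prompt}"

prompts = [f"query {i}" for i in range(4)]

# One local GPU: queries complete one after another, so total
# latency grows linearly with the number of queries.
start = time.time()
sequential = [run_inference(p) for p in prompts]
sequential_time = time.time() - start

# A network of nodes: with one node per query, total latency stays
# near a single query's latency -- the advantage described above.
start = time.time()
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(run_inference, prompts))
parallel_time = time.time() - start
```

With four simulated nodes, the parallel batch finishes in roughly the time of one query instead of four.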
So you rent GPUs from users and in return give them a coin which totally makes up for the time used. Just like Turo, but for GPUs. I’ll pass. I don’t buy your claim that the best models take lots of hardware to run. The best models require VRAM, but how are you going to share VRAM over the internet?
The input and output are text, which is very easy to send over the internet. Yes, the best models need VRAM; how that’s different from “lots of hardware,” I don’t know. How do you think ChatGPT works? You send them text, rent GPU time, and get text back. We are proposing the same thing, but distributed. And yes, like Turo, but for GPUs. It’s very similar to cloud gaming, another use case for renting GPU time, except that unlike cloud gaming we don’t have strict bandwidth limitations. We can even have several users, each with a small amount of VRAM, combine it to run very large models by splitting the model into sections. This has already been accomplished: https://github.com/bigscience-workshop/petals. On top of that, we are proposing a validation method to verify whether the output is correct, and we are offering an incentive structure. The problem with Petals is that it relies on charity.
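The “splitting the model into sections” idea is pipeline-style partitioning: each peer holds only a contiguous slice of the layers and forwards activations to the next peer, so no single machine needs VRAM for the whole model. The sketch below is a toy illustration of that flow under assumed names (`make_layer`, `node_forward` are hypothetical), with each layer reduced to a simple scaling so the data path is visible; it is not how Petals itself is implemented.

```python
# Each "node" holds a contiguous slice of the model's layers and only
# needs memory for its own slice, not for the whole model.
def make_layer(weight):
    # Toy stand-in for a transformer layer: scale every activation.
    return lambda xs: [weight * v for v in xs]

layers = [make_layer(w) for w in (1.0, 2.0, 0.5, 3.0)]

# Split four layers into two sections hosted on two peers.
sections = [layers[:2], layers[2:]]

def node_forward(section, activations):
    # In a real deployment the activations would arrive over the
    # network and the result would be sent to the next peer; this
    # local call stands in for that hop.
    for layer in section:
        activations = layer(activations)
    return activations

x = [1.0, -1.0]
for section in sections:
    x = node_forward(section, x)  # activations travel node to node
# x is now [3.0, -3.0]: the product of all four weights applied in order.
```

Only the (comparatively small) activation tensors cross the network between sections, which is why text-in/text-out inference tolerates consumer bandwidth.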
The key difference here though is that ChatGPT does not rent GPU time from its users. It rents GPU time from datacenters, for fiat currency.
Datacenters take that money and use it to buy new hardware and to maintain that hardware. If I spent my own money on my hardware, I would want to rent it out at a decent rate. After all, unlike datacenters, which buy electricity in bulk, I have to pay residential rates for electricity.
I am not convinced that a crypto payment in an unknown coin could make up for what I spend in electricity. I would expect to earn something similar to proof of work models, maybe about $1 after a month’s worth of processing. After electricity costs, that’s going to cost me probably $10 or more, making it not worth my while.
Maybe it will be worthwhile for others. But most people could likely just install serge or LLaMa and get more “bang for the buck.” I guess it all comes down to how you hand out the coin and what other perks come with being a miner / staking node / person with a GPU.
I mean, if I don’t get paid for it, could I at least use the network for free?