I think “AI” has just become shorthand for LLMs and diffusers though in general conversation.
If it's being used in that context, the person using it is an idiot.
huggingface.co/models shows many of the things AI is being used for. And even in the context of only LLMs and diffusion models, you cannot claim that LLMs are worthless with a straight face.
It’s not a question of “worthless” so much as “net benefit”. How much money and manpower are we investing in the tools?
Because, right now, the Sam Altman approach to LLMs is to simply throw more compute at the problem forever. The degree to which he seems interested in reinventing the model or the foundational technology pales beside his demands for gigawatts of new power to brute-force a better solution.
If you’re spending $1T to do $100B worth of human labor, that’s not any kind of efficiency.
I understand the sentiment but there are very few technologies that didn’t need a disproportionate amount of research and development before seeing proper “net benefits”.
Brute-forcing is the least efficient form of R&D. The best efficiency was achieved at Bell Labs, ARPA, early NASA, and Xerox PARC.
The trash bin of history is full of ideas that absorbed enormous amounts of resources and labor, only to flounder on implementation. The idea that Sam Altman’s pet project just needs another trillion to take off is heavily predicated on him building the next Model T and not the next Hindenburg.
The Hindenburg exploded for political reasons. It was capable of flying on helium, but the US, the world's biggest helium producer at the time, refused to export it to Germany.
Americans rapidly constructing for-profit nuclear power plants to run AI server farms, without any kind of plan for where to source fuel or dispose of waste, wouldn't know anything about that.