Pennomi wrote a whole list of potential ideas. And honestly, while I agree that local LLMs on typical hardware are underpowered for most tasks, it's possible they'd offer local models as an option for those whose hardware can run them.
People are getting all upset over this announcement without even knowing what their plan actually is, like the word "AI" is making them foam at the mouth or something. I'm just saying we should reserve judgement until we have an idea of what's actually happening.
Removed by mod
In their defense, Mozilla did create the easiest way to run and integrate an LLM locally, so if anyone could do it, I imagine it would be them.
https://github.com/Mozilla-Ocho/llamafile
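To give a sense of how simple the integration side already is, here's a minimal sketch of talking to a llamafile from Python. This assumes a llamafile is already running in server mode on its default port (8080) and that it exposes llama.cpp's OpenAI-compatible chat endpoint; the model name is a placeholder, since a llamafile serves whatever model is baked into it.

```python
# Minimal sketch: query a locally running llamafile.
# Assumes the llamafile's embedded server is up on localhost:8080
# and exposes the OpenAI-compatible /v1/chat/completions endpoint.
import json
import urllib.request

payload = {
    "model": "local",  # placeholder; the llamafile serves its embedded model
    "messages": [
        {"role": "user", "content": "Summarize this page in one sentence."}
    ],
}

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)

print(reply["choices"][0]["message"]["content"])
```

Everything runs on localhost, so nothing leaves the machine, which is presumably the whole appeal if Mozilla goes the local-model route.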
Removed by mod
Removed by mod
Yes, and then you asked for ideas, which were in the comment you replied to.
Removed by mod