On a totally unrelated note, https://msty.app/ is extremely easy to use and runs LLMs locally. Good model choices include Llama 3.2, Granite, DeepSeek R1, and Dolphin 3. It can run on Nvidia GPUs, Apple Silicon, and plain CPUs. They claim AMD support as well (supposedly since November), but it isn't working. Not as friendly, but https://lmstudio.ai/ runs on just about any hardware.
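For what it's worth, LM Studio can also expose an OpenAI-compatible local server, so you can script against whatever model you've loaded. A minimal sketch in Python, assuming the server is enabled on its default port (1234); the model name here is a placeholder for whichever model you actually loaded:

```python
# Query a locally served model over LM Studio's OpenAI-compatible
# endpoint. Port 1234 is LM Studio's default; adjust if you changed it.
import requests

resp = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "llama-3.2-3b-instruct",  # placeholder: use the model you loaded
        "messages": [{"role": "user", "content": "Say hello in one line."}],
        "temperature": 0.7,
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```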