Hello, I’ve been hearing a lot about this new DeepSeek LLM and was wondering: would it be possible to get the 600+ billion parameter model running on my GPU? I’ve heard something about people getting it to run on their MacBooks. I have an i7-4790K, 32GB of DDR3, and a 7900 XTX with 24GB of VRAM. I’m running Arch Linux, and this computer is mostly for AI stuff, not so much gaming. I did try running the distilled 14B parameter model with GPT4All, but it didn’t work for me. I’m thinking about getting one of the NVIDIA 5090s in the future. Thanks in advance!
You can run it from the terminal, or use something like AnythingLLM, which has a GUI and lets you import pictures and websites.
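
If you go the terminal route, here's a minimal Python sketch of what that could look like with llama-cpp-python and a GGUF build of the 14B distill. The model filename, quantization, and parameters below are placeholder assumptions, not a tested setup; on a 7900 XTX you'd want llama.cpp built with ROCm/HIP support for GPU offload.

```python
from llama_cpp import Llama

# Load a local GGUF of the distilled model (path/quant are placeholders).
llm = Llama(
    model_path="./DeepSeek-R1-Distill-Qwen-14B-Q4_K_M.gguf",
    n_gpu_layers=-1,  # offload all layers to the GPU (needs a ROCm/HIP build on AMD)
    n_ctx=4096,       # context window; lower it if you run out of VRAM
)

# Simple chat-style request and print the reply text.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello, who are you?"}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```

The main knob here is `n_gpu_layers`: with 24GB of VRAM a 4-bit 14B model should fit fully offloaded, but if it doesn't, you can set a smaller number of layers and let the rest run on the CPU.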