Howdy!

(moved this from the noob question thread because it got no replies)

I’m not a total noob when it comes to general compute and AI. I’ve been using online models for some time, but I’ve never tried to run one locally.

I’m thinking about buying a new computer for gaming and for running/testing/developing LLMs (not training, only inference and in-context learning). My understanding is that ROCm is becoming decent (and I also hate Nvidia), so I’m thinking a Radeon RX 7900 XTX might be a good start. If I buy the right motherboard, I should be able to put a second XTX in there later, if I use watercooling.

So first, what do you think about this? Are the 24 gigs of VRAM worth the extra bucks? Or should I just go for a mid-range GPU like the Arc B580?

I’m also curious about experimenting with a no-GPU setup, i.e. CPU + lots of RAM. What kind of models do you think I’d be able to run, with decent performance, on something like a Ryzen 7 9800X3D with 128/256 GB of DDR5? How would that compare to the Radeon RX 7900 XTX? Is it possible to use both CPU and GPU when running inference with a single model, or is it either/or?

Also… wouldn’t it be better if noobs posted questions in the main thread? Questions would probably reach more people that way. It’s not like there’s a ton of activity here…

  • hendrik@palaver.p3x.de

    I think this warrants an extra post. And the beginners thread is a year old and I guess not a lot of people watch comments there.

    I use KoboldCpp and like to recommend that to people who are new to the hobby or don’t own a proper gaming rig. It’s relatively easy to install and you can try it now, without any GPU, and see if you like it. I’d say it’s usable on CPU up to about 13B (with quantized models). Of course it’ll be orders of magnitude slower than a GPU.
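
    If you want to kick the tires before committing to any hardware, the llama-cpp-python bindings drive the same llama.cpp engine that KoboldCpp is built on. A minimal CPU-only sketch (the model path is just an example; any quantized GGUF file you’ve downloaded will do):

    ```python
    # Minimal CPU-only inference with llama-cpp-python
    # (pip install llama-cpp-python). The model path is hypothetical;
    # grab any quantized GGUF model first.
    from llama_cpp import Llama

    llm = Llama(model_path="models/mistral-7b.Q4_K_M.gguf", n_ctx=2048)
    out = llm("Write a haiku about VRAM:", max_tokens=48)
    print(out["choices"][0]["text"])
    ```

    If that generates text at a speed you can live with, a CPU-only box is viable; if not, you’ve lost nothing but a download.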

I’d say every bit of VRAM counts, so you might as well buy as much as you can afford; you’ll be able to run more intelligent models. Use one of the VRAM calculators to check what fits into 16 GB or 24 GB at the model and context size you actually want.
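
    For a rough idea of what those calculators compute: the weights take about params × bits-per-weight ÷ 8 bytes, and the KV cache grows linearly with context length. A back-of-the-envelope sketch, using Llama-2-13B’s shape (40 layers, hidden size 5120, no GQA) as the example; real usage runs a bit higher because of activations and buffers:

    ```python
    # Rough VRAM estimate: quantized weights + fp16 KV cache.
    # Treat this as a lower bound and use a proper calculator
    # before buying anything.

    def vram_estimate_gb(params_b, bits_per_weight, n_layers, hidden, n_ctx):
        weights = params_b * 1e9 * bits_per_weight / 8   # quantized weights
        kv_cache = 2 * n_layers * n_ctx * hidden * 2     # K + V in fp16
        return (weights + kv_cache) / 1024**3

    # 13B model at ~4.5 bits/weight (Q4_K_M-ish), 4096 context: ~9.9 GB
    print(round(vram_estimate_gb(13, 4.5, 40, 5120, 4096), 1))
    ```

    So a quantized 13B with a modest context fits comfortably in 16 GB, while 24 GB opens up 30B-class models or much longer contexts.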

Edit: And mixing GPU and CPU makes everything considerably slower. It’s a trade-off for people with less VRAM, but if you’re buying something new, you should try to fit everything into the GPU alone.
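
    To make that trade-off concrete: llama.cpp-based runners (KoboldCpp included) let you choose how many transformer layers live in VRAM, and whatever doesn’t fit runs on the CPU. In llama-cpp-python the knob is n_gpu_layers (KoboldCpp exposes the same thing as --gpulayers); the path and layer count below are just examples:

    ```python
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/llama-2-13b.Q4_K_M.gguf",  # hypothetical path
        n_gpu_layers=-1,    # -1 = offload all layers to the GPU (fastest)
        # n_gpu_layers=20,  # partial offload: the rest runs on CPU, much slower
        n_ctx=4096,
    )
    print(llm("The capital of France is", max_tokens=8)["choices"][0]["text"])
    ```

    Watching tokens/sec drop as you lower n_gpu_layers makes it very obvious why fitting the whole model into VRAM is worth paying for.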