baatliwala@lemmy.world to memes@lemmy.world · 2 days ago
The AI revolution is coming
Deepseek is good locally.

Mora@pawb.social · 1 day ago
As someone who is rather new to the topic: I have a GPU with 16 GB VRAM and only recently installed Ollama. Which size should I use for Deepseek R1? 🤔

kyoji@lemmy.world · 17 hours ago
I also have 16 GB VRAM and the 32b version runs OK. Anything larger would take too long, I think.

Lurker · 1 day ago
You can try the sizes from smallest to largest. You can probably run the biggest one too, but it will be slow.
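As a rough sanity check on which sizes fit, here is a back-of-the-envelope sketch. It assumes roughly 0.55 GB of VRAM per billion parameters for a 4-bit quantized model, plus about 2 GB of overhead for the context/KV cache; the real numbers vary with the quantization and context length you use, so treat the constants as ballpark assumptions, not Ollama-reported figures.

```python
# Rough VRAM estimate for 4-bit quantized models.
# Assumptions (not exact): ~0.55 GB of VRAM per billion parameters for
# the weights, plus ~2 GB of overhead for context/KV cache.
def fits_in_vram(params_billion, vram_gb=16, gb_per_billion=0.55, overhead_gb=2.0):
    """Return (fits, estimated_gb) for a model of the given size."""
    needed = params_billion * gb_per_billion + overhead_gb
    return needed <= vram_gb, round(needed, 1)

if __name__ == "__main__":
    # Parameter counts of some common model sizes, in billions.
    for size in (7, 14, 32, 70):
        ok, need = fits_in_vram(size)
        verdict = "fits in VRAM" if ok else "spills to CPU/RAM (slow)"
        print(f"{size}b: needs ~{need} GB -> {verdict}")
```

By this estimate a 14b model fits comfortably in 16 GB, while 32b spills into system RAM, which lines up with the "runs ok but slow" experience described above.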