Tea@programming.dev to Technology@lemmy.world · English · 4 days ago
Nvidia creates gaming-centric AI chatbot that runs on your GPU, locally. (www.nvidia.com)
WolfLink · 20 points · 4 days ago
You can run your own LLM chatbot with https://ollama.com/
They have some really small ones that only require like 1GB of VRAM, but you’ll generally get better results if you pick the biggest model that fits on your GPU.
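For example, here is a minimal sketch of talking to a locally running Ollama model from Python over its REST API. It assumes Ollama is installed and serving on its default port (11434), and uses "llama3.2:1b" purely as a placeholder for whichever small model you've pulled:

```python
# Minimal sketch: query a locally running Ollama model via its REST API.
# Assumes the Ollama server is running on the default port (11434) and that
# a model has already been pulled, e.g. with `ollama pull llama3.2:1b`.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3.2:1b"  # placeholder; pick the biggest model that fits in your GPU's VRAM

def ask(prompt: str) -> str:
    payload = json.dumps({
        "model": MODEL,
        "prompt": prompt,
        "stream": False,  # return one complete response instead of streamed chunks
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Give me one tip for improving FPS in games."))
```

The model name here is just an example; check Ollama's model library for the current small options and swap in whatever fits your VRAM.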