Tea@programming.dev to Technology@lemmy.world · English · 6 days ago
Nvidia creates gaming-centric AI chatbot that runs on your GPU, locally. (www.nvidia.com)
WolfLink · 6 days ago
You can run your own LLM chatbot with https://ollama.com/
They have some really small ones that only require like 1GB of VRAM, but you’ll generally get better results if you pick the biggest model that fits on your GPU.
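For example, once Ollama is installed and its local server is running, you can chat with a model from Python using the official ollama client. This is just a minimal sketch; it assumes you've already pulled a small model like llama3.2 (swap in whatever fits your VRAM, `ollama list` shows what you have locally):

```python
# Minimal sketch using the Ollama Python client (pip install ollama).
# Assumes the Ollama server is running locally and that "llama3.2"
# (an example model name) has already been pulled with `ollama pull`.
import ollama

response = ollama.chat(
    model="llama3.2",  # pick the biggest model that fits on your GPU
    messages=[
        {"role": "user", "content": "Give me three tips for tuning game graphics settings."}
    ],
)

# The reply text lives under message -> content in the response.
print(response["message"]["content"])
```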