Hexadecimal
Track_Shovel@slrpnk.net to Lemmy Shitpost@lemmy.world, English · 12 days ago
vvilld@lemmy.world · 10 days ago:
I meant, is hosting it locally something someone without a coding background can do easily?
Fillicia · 10 days ago:
Without a coding background, yes. For someone technically illiterate it might be an issue. You can get a good starting point by looking at Ollama.
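For reference, a minimal Ollama session looks something like this (a sketch assuming Ollama is already installed from ollama.com; the `deepseek-r1:7b` tag matches a distilled model Ollama hosts, but check their model library for current tags):

```shell
# Download a small DeepSeek-R1 distill, then chat with it interactively.
# No coding needed: "run" drops you into a prompt in the terminal.
ollama pull deepseek-r1:7b
ollama run deepseek-r1:7b
```

Both commands require the Ollama background service to be running, which the installer sets up for you.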
rumba@lemmy.zip · 10 days ago:
Oh yeah, it's not bad. You can install Ollama and Docker, then install Open WebUI in Docker and tell Open WebUI to go get DeepSeek. Instructions: https://archive.is/fOWXO. Or you can try pinokio.computer or jan.ai.
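The steps above can be sketched roughly as follows (assumes Docker is already installed and Ollama runs natively on the host; the image name and flags follow Open WebUI's published Docker instructions, but verify against their docs before relying on them):

```shell
# 1. Install Ollama natively (Linux one-liner from ollama.com) and pull a model
curl -fsSL https://ollama.com/install.sh | sh
ollama pull deepseek-r1:7b

# 2. Run Open WebUI in Docker, pointing it at the host's Ollama API (port 11434).
#    --add-host lets the container reach the host as host.docker.internal;
#    the named volume persists chats and settings across restarts.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main

# 3. Browse to http://localhost:3000 and pick the deepseek-r1 model in the UI.
```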
Raptorox · 8 days ago:
A really simple way is to use LM Studio. You just install it and select deepseek-r1; the default is 7B, IIRC.