capybeby to Lemmy Shitpost@lemmy.world · 9 days ago
Yes that is definitely what I was going for, thank you
ayyy · 8 days ago
Apple's autocorrect is based on GPT, of ChatGPT fame, but it's the GPT-2 model, which came out around 2019. Back then it ran on supercomputers; now it fits in your pocket. ChatGPT uses the GPT-4 model, which requires really big supercomputers, for now.
gmtom@lemmy.world · 8 days ago
That's to train it. Even the most modern versions of AIs like DeepSeek or GPT can run on a Raspberry Pi.
ayyy · 8 days ago
Only if you distill it to the point of uselessness (well, even more useless). Otherwise you wouldn't have headlines like this.
gmtom@lemmy.world · 8 days ago
No? I literally run a local DeepSeek model on a Raspberry Pi to use as an Alexa-like personal assistant.
ayyy · 8 days ago
Yeah, I guess it works fine for anything that doesn't require a real context window.