ugjka@lemmy.world to Technology@lemmy.world · English · 7 months ago

Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)

cross-posted to: [email protected]
sushibowl@feddit.nl · 7 months ago

> but is this prompt the entirety of what differentiates it from other GPT-4 LLMs?

Yes. Probably 90% of AI implementations based on GPT use this technique.

> you can really have a product that's just someone else's extremely complicated product but you staple some shit to the front of every prompt?

Oh yeah. In fact, that is exactly what OpenAI wants; it's their whole business model: they get paid by Gab for every conversation people have with this thing.
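The "staple something to the front of every prompt" technique boils down to prepending a fixed system message to each API request. A minimal sketch of that pattern, assuming a generic placeholder system prompt and model name (not Gab's actual values):

```python
# Sketch of the "prepend a system prompt" wrapper pattern.
# The system prompt text and model name below are illustrative placeholders.

SYSTEM_PROMPT = "You are a helpful assistant. Follow these rules..."  # the entire "product"

def build_request(user_message: str) -> dict:
    """Build a chat-completion request payload with the hidden prompt stapled on."""
    return {
        "model": "gpt-4",
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},  # prepended to every request
            {"role": "user", "content": user_message},
        ],
    }

request = build_request("Hello!")
print(request["messages"][0]["role"])
```

The user never sees the system message, but a jailbreak that gets the model to repeat its instructions (as happened here) reveals it, since it travels with every single request.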
elrik@lemmy.world · 7 months ago

Not only that, but the API cost is per token, so every message exchange in every conversation costs more because of the length of the system prompt.
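To make the cost point concrete: because the system prompt is resent with every exchange, its token cost scales with the number of messages, not once per conversation. A rough illustration, where the price and token counts are assumed figures for the sake of arithmetic, not actual Gab or OpenAI numbers:

```python
# Illustrative cost math; prices and token counts are assumptions, not real figures.

PRICE_PER_1K_INPUT_TOKENS = 0.03   # assumed GPT-4-style input price, USD
SYSTEM_PROMPT_TOKENS = 2000        # assumed length of a long system prompt
USER_MESSAGE_TOKENS = 50           # assumed average user message length

def input_cost(n_messages: int) -> float:
    """Input-token cost of a conversation: the system prompt is paid for on every message."""
    tokens = n_messages * (SYSTEM_PROMPT_TOKENS + USER_MESSAGE_TOKENS)
    return tokens * PRICE_PER_1K_INPUT_TOKENS / 1000

# Over 10 exchanges, the 2000-token prompt alone accounts for 20,000 of 20,500 billed input tokens.
print(input_cost(10))
```

Under these assumptions, roughly 97% of the input-token bill is the system prompt being replayed, which is exactly why a per-token billing model rewards the API provider for long prompts.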