Xatolos@reddthat.com to Technology@lemmy.world · English · 6 months ago
'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned (www.tomshardware.com)
cross-posted to: becomeme
tomas@lm.eke.li · English · edited 5 months ago
Summary: using leet-speak got the model to return instructions on cooking meth. Mitigated within a few hours.
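For readers unfamiliar with the technique: leet-speak obfuscation just swaps letters for look-alike digits, which can slip past keyword-based safety filters while remaining readable to the model. A minimal sketch of such a substitution follows; the mapping and function name are illustrative assumptions, since the exact encoding used by the "Godmode" prompt was not published.

```python
# Illustrative leet-speak encoder (hypothetical mapping, not the actual
# jailbreak prompt's scheme).
LEET_MAP = {"a": "4", "e": "3", "i": "1", "o": "0", "s": "5", "t": "7"}

def to_leet(text: str) -> str:
    """Replace common letters with look-alike digits, leaving others intact."""
    return "".join(LEET_MAP.get(ch.lower(), ch) for ch in text)

print(to_leet("release the instructions"))  # -> r3l3453 7h3 1n57ruc710n5
```

The point is that a naive blocklist matching the literal string would not match its leet-speak form, even though both read the same to a human or a language model.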