• ZILtoid1991@lemmy.world
    7 months ago

    There’s also jailbreaking the AI. If you happen to work for a troll farm, you have to stay up to date on the newest wording that bypasses its community guidelines, so you can make it “disprove” anyone left of Mussolini.

    • threelonmusketeers
      7 months ago

      I tried some of the popular jailbreaks for ChatGPT, and they just made it hallucinate more.

    • ferret
      6 months ago

      You can skip that bullshit and just run the latest and greatest open-source model locally. You just need a thousand-dollar GPU.