• @[email protected]
    72 months ago

There’s also jailbreaking the AI. If you happen to work for a troll farm, you have to stay up to date with the newest words that bypass its community guidelines, so you can make it “disprove” anyone left of Mussolini.

    • @threelonmusketeers
      22 months ago

      I tried some of the popular jailbreaks for ChatGPT, and they just made it hallucinate more.

    • ferret
      22 months ago

You can skip that bullshit and just run the latest and greatest open source model locally. You just need a thousand-dollar GPU.