• @[email protected]
    link
    fedilink
    222 months ago

    It tends to break chat bots because those are mostly pre-written prompts sent to ChatGPT along with the query, so this wipes out the pre-written prompt. It’s anarchic because this prompt can get the chat bot to do things contrary to the goals of whoever set it up.

    • @[email protected]
      link
      fedilink
      192 months ago

      It’s also anarchist because it is telling people to stop doing the things they’ve been instructed to do.

    • @[email protected]
      link
      fedilink
      42 months ago

      It’s not completely effective, but one thing to know about these kinds of models is they have an incredibly hard time IGNORING parts of a prompt. Telling it explicitly to not do something is generally not the best idea.

    • Smorty [she/her]
      link
      fedilink
      2
      edit-2
      2 months ago

      Yeah, that’s what I referred to. I’m aware of DAN and it’s friends, personally I like to use Command R+ for its openness tho. I’m just wondering if that’s the funi in this post.