• skulblaka · 4 hours ago

    Cool story bro, but how is that relevant to anything?

      • skulblaka · 2 hours ago

        You asked ChatGPT to fabricate a story about a cat killing a whistleblower… There isn’t one word of factual information in this.

        • ComradeMiao@lemmy.dbzer0.com · 1 hour ago

          Okay buddy. ChatGPT won’t answer questions about violence unless you frame them as a game. This is common knowledge. ChatGPT will often give honest answers to “game” questions it would otherwise refuse, like how to build a bomb, framed as a joke or a dream. Do I need to keep explaining the obvious? I asked ChatGPT to act as a cat, not to explain how a cat would do it lmao

          • skulblaka · 1 hour ago

            ChatGPT also doesn’t give true answers; it gives an approximation of what you want to hear, without any regard for truth or accuracy. This is how every LLM functions. It does not know facts, and it cannot care about telling you facts, because it does not know what facts are.

            Besides which, it didn’t actually tell you anything; it just acted like Puss in Boots for 20 seconds because you told it to.

            This has accomplished nothing other than going “nyaaaa~” in a public forum where people were trying to have a serious discussion about how concerning it is that people are losing their lives to corporate assassinations. No one involved has learned anything, and the discussion is now worse off for its inclusion.

            I hope the 2.9 watt-hours and 8 ounces of water you just wasted were worth it.