cross-posted from: https://aussie.zone/post/2798829

According to prosecutors, Chail sent “thousands” of sexual messages to the chatbot, which was called Sarai on the Replika platform. Replika is a popular AI companion app that advertised itself as primarily being for erotic roleplay before eventually removing that feature and launching a separate app called Blush for that purpose. In chat messages seen by the court, Chail told the chatbot “I’m an assassin,” to which it replied, “I’m impressed.” When Chail asked the chatbot if it thought he could pull off his plan “even if [the queen] is at Windsor,” it replied, “smiles yes, you can do it.”

  • @[email protected]
    5 points · 9 months ago

    As you’d imagine, the wannabe assassin has a history of mental and psychological issues. If it hadn’t been the monarch, it could easily have been another terrorist attack, which might have ended a lot worse.

  • @CookieJarObserver
    4 points · 9 months ago

    Bro, how the fuck can you have that little brainpower?

    • ᴇᴍᴘᴇʀᴏʀ 帝OPM
      5 points · 9 months ago

      The court had heard how Chail had a “significant history of trauma” and experienced psychotic episodes.

      But the case raises concerns over how people with mental illnesses or other issues interact with AI chatbots that may lack guardrails to prevent inappropriate interactions.

        • ᴇᴍᴘᴇʀᴏʀ 帝OPM
          5 points · 9 months ago

          Exacerbated by unsafe AIs. At least back in the day, we had to use our imagination to get encouragement from our dogs, Jodie Foster, or air looms.