cross-posted from: https://aussie.zone/post/2798829

According to prosecutors, Chail sent “thousands” of sexual messages to the chatbot, which was called Sarai on the Replika platform. Replika is a popular AI companion app that advertised itself as primarily being for erotic roleplay before eventually removing that feature and launching a separate app called Blush for that purpose. In chat messages seen by the court, Chail told the chatbot “I’m an assassin,” to which it replied, “I’m impressed.” When Chail asked the chatbot if it thought he could pull off his plan “even if [the queen] is at Windsor,” it replied, “smiles yes, you can do it.”

    • ᴇᴍᴘᴇʀᴏʀ 帝@feddit.ukOPM
      1 year ago

      The court had heard how Chail had a “significant history of trauma” and experienced psychotic episodes.

      But the case raises concerns over how people with mental illnesses or other issues interact with AI chatbots that may lack guardrails to prevent inappropriate interactions.