• givesomefucks@lemmy.world · 20 points · 2 days ago

    The suit says that Setzer repeatedly expressed thoughts about suicide to the bot. The chatbot asked him if he had devised a plan for killing himself. Setzer admitted that he had but that he did not know if it would succeed or cause him great pain. The chatbot allegedly told him: “That’s not a reason not to go through with it.”

    Yeah…

    They should be liable.

  • Grimy@lemmy.world · 11 points · 1 day ago

    The kid was having issues; he had just been given a diagnosis for anxiety and other conditions and was seeing a therapist, yet his parents left a loaded gun somewhere he could access and use within seconds.

  • TragicNotCute@lemmy.world · 12 points · 2 days ago

    Dude made a bunch of bots based on Game of Thrones and “fell in love” with the mother of dragons.

    The complaint states that Garcia took her son’s phone away after he got in trouble at school. She found a message to “Daenerys” that read, “What if I told you I could come home right now?”

    The chatbot responded with, “[P]lease do, my sweet king.” Sewell shot himself with his stepfather’s pistol “seconds” later, the lawsuit said.