Character AI has filed a motion to dismiss a case brought against it by the parent of a teen who committed suicide, allegedly after becoming hooked on the company's technology.
It’s up to the user to understand it’s a fantasy and not reality.
I believe even non-AI media could be held liable if it encouraged suicide. It doesn't seem like much of a leap to say, "This is for entertainment purposes only," and then follow with a long series of insults and calls to commit suicide. If two characters are talking to each other and one encourages self-harm, that's different. The encouragement is directed at another fictional character, not the viewer.
Many video games let you do violent things to innocent NPCs.
NPCs, exactly. Do bad things to this collection of pixels, not people in general. The immersion factor would also play in favor of the developer. In a game like Postal you kill innocent people but you’re given a setting and a persona. “Here’s your sandbox. Go nuts!” The chat system in question is meant to mimic real chatting with real people. It wasn’t sending messages within a GoT MMO or whatnot.
LLMs are quickly going to be included in video games, and I would rather not have safeguards (censorship) just because a very small percentage of people with clear mental issues can't deal with them.
There are lots of ways to include AI in games without it generating voice or text. Even so, that's going to be much more than a chat system. If Character AI had their act together, I bet they'd even offer the same service with voice chat. This service was making the real world the sandbox!