Character AI has filed a motion to dismiss a case brought against it by the parent of a teen who died by suicide, allegedly after becoming hooked on the company's technology.
Precisely. Yet so many LLMs make outrageous claims, or at least fail to make the limitations obvious.
My point is that it’s not on the user to see past the BS; it’s on the provider of the service. The company’s argument is that they’re not responsible because computer code is protected by the First Amendment. I think that misses the whole issue, which is that users may not be made sufficiently aware of the limitations and dangers of the service.
A service can only do so much. Some folks are just dumb or mentally unwell. The question is whether they did enough to communicate the limitations of AI. Free speech is the wrong argument. I think we're in agreement, except that you seem to assume they didn't communicate it well enough and I'm assuming they did. That's what the court case should be about.