Behold - the only way made-up images could be CSAM. This fucker referenced actual children. (Worse: children he has access to, in a relationship of trust. Doing this for patients of any age is find-a-new-job territory, bare minimum.)
But AI is irrelevant here. It would be equally criminal to use scissors and glue, or MS Paint.
And I’m horrified by how many people think AI needs examples of the exact thing it’s generating. People - it distills concepts, from labeled images. It can combine concepts. Like “steampunk space shuttle.” Or “Little Tikes F1 car.” Or “Margaret Thatcher naked.” Even if no such image is in the training data. Even if (god willing) no such image exists.
Yes, AI can generate an image that is both a child and naked, even if it only has examples of clothed children and naked adults. It can generate an image that is both a horse and a hearse. Do you think it needs a real example that’s both at once, to generate a hallucination that’s both at once?