doesn’t it follow that AI-generated CSAM can only be generated if the AI has been trained on CSAM?

This article even explicitly says as much.

My question is: why aren’t OpenAI, Google, Microsoft, Anthropic… sued for possession of CSAM? It’s clearly in their training datasets.

  • mindbleach · 16 days ago

    Worth noting: it can also start from another image. A drawing, a photo, whatever. It will “denoise” that the same way, to better match the prompt.

    This is why it’s aggravating to explain to people: you cannot generate CSAM. It’s a contradiction in terms. CSAM means photographic evidence of child rape. If that didn’t happen, there cannot be photos of it happening. But since you can do the digital equivalent of copy-pasting a real child’s face onto some naked woman, you technically almost sorta kinda aaaughhh. Like. It’s probably a crime? But it’s not the same kind of crime, for reasons I’d hope are obvious. And when some people use “CSAM” to refer to drawings of Bart Simpson, I wonder if language was a mistake.