A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if customers send them a picture of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: the ease of creating nonconsensual pornography of ordinary people.
Obligatory nitpick: you can’t generate CSAM. That’s not what “CSAM” means. The entire point of calling it that is to distinguish actual pictures of actual children suffering real-life sexual abuse from things that didn’t happen.
It’s like someone claiming “I’ll generate proof of murder.” You sure won’t, bucko. Whatever gross image you produce, it will never be that. Especially if the guy in the image is still walking around.
Ahaha true!
To clarify: the comment I copied used “generate” to mean “produce”.
In any case, your point is well taken. Something to think about.
Ahh, yeah, I guess that’d be the definition used prior to… all of this.
Though doing that on demand for ten bucks is a whole different kind of fucked up.