Europol has supported authorities from 19 countries in a large-scale hit against child sexual exploitation that has led to 25 arrests worldwide. The suspects were part of a criminal group whose members were engaged in the distribution of images of minors fully generated by artificial intelligence (AI).
What would stop someone from creating a tool that tagged real images as AI generated?
Have at it with drawings that are easily distinguished from photos, but if anything is photorealistic, I feel like it needs to be treated as real.
Some form of digital signatures for allowed services?
Sure, it will limit the choice of where to legally generate content, but it should work.
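The signature idea above can be sketched. This is a minimal, hypothetical illustration only: real provenance schemes (e.g. C2PA) embed asymmetric signatures in image metadata, whereas this sketch uses stdlib HMAC with a made-up `SERVICE_KEY` to keep it self-contained. All names here are invented for the example.

```python
import hmac
import hashlib

# Hypothetical: secret held only server-side by the approved generation
# service. If this key (or the model) leaks, anyone could tag arbitrary
# real images as "AI-generated" -- the objection raised later in the thread.
SERVICE_KEY = b"server-side secret held by the generation service"

def sign_output(image_bytes: bytes) -> str:
    """Produce a provenance tag for an image the service generated."""
    return hmac.new(SERVICE_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_output(image_bytes: bytes, tag: str) -> bool:
    """Check that the tag matches these exact bytes."""
    return hmac.compare_digest(sign_output(image_bytes), tag)

generated = b"\x89PNG...fake image bytes..."
tag = sign_output(generated)
print(verify_output(generated, tag))          # True: untampered service output
print(verify_output(b"other bytes", tag))     # False: tag doesn't match
```

Note the design constraint this makes visible: verification only proves "this exact file was tagged by whoever holds the key," which is why the scheme collapses once the signing capability is available client-side.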
I highly doubt any commercially available service is going to get in on officially generating photorealistic CSAM.
Open-source models exist and can be forked.
…and then we’re back at “someone can take that model and tag real images to appear AI-generated.”
You would need a closed-source model run server-side in order to prevent that.
Yep, essentially. But that's only for the photorealistic case.