Police in 20 countries have taken down a network that distributed images of child sexual abuse generated entirely by artificial intelligence.
The operation – which spanned European countries including the United Kingdom as well as Canada, Australia and New Zealand – is “one of the first cases” involving AI-generated child sexual abuse material, Europe’s law enforcement agency Europol, which supported the action, said in a press release.
Danish authorities led the operation, which resulted in 25 arrests, 33 house searches and the seizure of 173 devices.
I’m not even talking about the accidentally scraped images. People retrain models to make porn that more accurately depicts their fetish all the time.