AI-created child sexual abuse images ‘threaten to overwhelm internet’::Internet Watch Foundation finds 3,000 AI-made abuse images breaking UK law
There is no such thing as generated CSAM.
That is the ENTIRE POINT of calling it CSAM.
I don’t care how badly you want to crack down on drawings, renders, or AI hallucinations. Stop using the same label as photographic evidence of child rape — a label that was specifically chosen to be obviously wrong when applied to fictional fucking characters.
You can’t abuse a child who does not exist!