AI-created child sexual abuse images ‘threaten to overwhelm internet’
Internet Watch Foundation finds 3,000 AI-made abuse images breaking UK law

  • mindbleach
    1 year ago

    There is no such thing as generated CSAM. That is the ENTIRE POINT of calling it CSAM: the term stands for child sexual abuse material, meaning a record of abuse that actually happened.

    I don’t care how badly you want to crack down on drawings, or renders, or AI hallucinations. Stop using the same label as photographic evidence of child rape. A label that was specifically chosen to be obviously wrong when applied to fictional fucking characters.

    You can’t abuse a child who does not exist!