AI-created child sexual abuse images ‘threaten to overwhelm internet’::Internet Watch Foundation finds 3,000 AI-made abuse images breaking UK law

  • bloopernova@programming.dev
    1 year ago

    There is a potential for proliferation of CSAM generated by AI. While the big AI generators are centralized and kept clear of most bad stuff, eventually unrestricted versions will become widespread.

We already have deepfake porn of popular actresses, which I think is harmful in itself. There have also been sexually explicit deepfakes made of preteen and young teenage girls in Spain, and I think that's the first of many similar incidents to come.

I can't think of a way to prevent this from happening without destroying much of AI's potential.

    • mindbleach
      1 year ago

      It is impossible for CSAM to be generated by AI.

      It is impossible for CSAM to be generated.

      You can’t sexually abuse children who don’t fucking exist.

      Please stop using the unambiguous term for actual photographs of real-life abuse, when referring to something a person made up, on a computer, alone.