• otp
    15 hours ago

    Some AI programs remove clothes from images of clothed people. That could cause harm to a child in the image or to their family.

    And the reason it can be called AI-generated CSAM is because the images are depicting something that would be CSAM if it were real. Just like we could say CGI murder victims. Or prosthetic corpses. Prosthetics can’t die, so they can’t produce corpses. But we can call them prosthetic corpses because they’re prosthetics to simulate real corpses.

    I’m curious as to why you seem to be defending this so vehemently though. You can call it AI CP if it makes you feel better.

    • Eezyville
      14 hours ago

      Photoshop can remove the clothes off a child too. Should we ban that and arrest people who use it? What about scissors and tape? You know, the old-fashioned way: take a picture of a child and put the face on someone else's body. Should we arrest people for doing that? This is all a waste of money and resources. Go after actual abusers and save real children instead of imaginary AI-generated 1's and 0's.

    • mindbleach
      14 hours ago

      Yeah, can’t imagine why evidence of child rape would deserve special consideration. We only invented the term CSAM, as opposed to CP, specifically to clarify when it is real, versus the make-believe of ‘wow, this would be bad, if it wasn’t made up.’

      Do you think CGI of a dead body is the same as murder?

      That’d be bad if it were real! It’d be murder! But it - fucking - isn’t, because it’s not real.

      I must underline, apparently: the entire reason for using the term “child sexual abuse material” is to describe material proving a child was sexually abused. That’s kinda fucking important. Right? That’s bad in the way that a human person dying is bad. If you treated James Bond killing someone, in a movie, the same way you treated an actual human person dying, people would think you’re insane. Yet every fucking headline about these renders uses the same language as if typing words into a program somehow fucked an actual child.