• mindbleach · 21 hours ago

    You cannot generate CSAM. That’s the entire point of calling it CSAM!

    CSAM means “photographic evidence of child rape.” If there’s no child - then that didn’t happen, did it? Fictional crimes don’t tend to be identically illegal, for obvious reasons. Even if talking about murder can rise to the level of a crime - it’s not the same crime, because nobody died.

    distribution of images of minors fully generated by artificial intelligence

    What minors? You’re describing victims who are imaginary. Zero children were involved. They don’t fucking exist.

    We have to treat AI renderings the same way we treat drawings. If you honestly think Bart Simpson porn should be illegal - fine. But say that. Say you don’t think photographs of child rape are any worse than or different from a doodle of a bright yellow dick. Say you want any depiction of the concept treated as badly as the real thing.

    Because that’s what people are doing, when they use the language of child abuse to talk about a render.

    • otp · 14 hours ago

      Some AI programs remove clothes from images of clothed people. That could cause harm to a child in the image or to their family.

      And the reason it can be called AI-generated CSAM is that the images depict something that would be CSAM if it were real. Just like we can say “CGI murder victims” or “prosthetic corpses.” Prosthetics can’t die, so they can’t produce corpses, but we call them prosthetic corpses because they’re prosthetics made to simulate real corpses.

      I’m curious as to why you seem to be defending this so vehemently though. You can call it AI CP if it makes you feel better.

      • Eezyville · 14 hours ago

        Photoshop can remove the clothes off a child too. Should we ban that and arrest the people who use it? What about scissors and tape? You know, the old-fashioned way: take a picture of a child and put the face on someone else’s body. Should we arrest people for doing that? This is all a waste of money and resources. Go after actual abusers and save real children instead of imaginary AI-generated 1s and 0s.

      • mindbleach · 14 hours ago

        Yeah, can’t imagine why evidence of child rape would deserve special consideration. We only invented the term CSAM, as opposed to CP, specifically to clarify when it is real, versus the make-believe of “wow, this would be bad, if it wasn’t made up.”

        Do you think CGI of a dead body is the same as murder?

        That’d be bad if it was real! It’d be murder! But it - fucking - isn’t, because it’s not real.

        I must underline, apparently: the entire reason for using the term “child sexual abuse material” is to describe material proving a child was sexually abused. That’s kinda fucking important. Right? That’s bad in the way that a human person dying is bad. If you treated James Bond killing someone, in a movie, the same way you treated an actual human person dying, people would think you’re insane. Yet every fucking headline about these renders uses the same language as if typing words into a program somehow fucked an actual child.