• Kusimulkku@lemm.ee · 6 hours ago

    Even in cases when the content is fully artificial and there is no real victim depicted, such as Operation Cumberland, AI-generated CSAM still contributes to the objectification and sexualisation of children.

    I get how fucking creepy and downright sickening this all feels, but I’m genuinely surprised that it’s illegal or criminal when no actual children are involved.

    It mentions sexual extortion, and that’s definitely something that should be illegal; same for spreading AI-generated explicit stuff about real people without their consent, whether children or adults. But idk about the case mentioned here.

  • BrianTheeBiscuiteer@lemmy.world · 18 hours ago

    On one hand, I don’t think this kind of thing can be consequence-free (from a practical standpoint). On the other hand… how old were the subjects? You can’t determine a person’s age just by looking at them, and someone who looks like a child but is actually an adult wouldn’t be charged as a child pornographer. The whole reason age limits are set is to give reasonable assurance that the subject is not being exploited or otherwise harmed by the act.

    This is a massive grey area and I just hope sentences are proportional to the crime. I could live with this kind of thing being classified as a misdemeanor provided the creator didn’t use underage subjects to train or influence the output.

    • General_Effort@lemmy.world (OP) · 2 hours ago

      It’s not a gray area at all. There’s an EU directive on the matter. If an image appears to depict someone under the age of 18, then it’s child porn. It doesn’t matter whether any minor was exploited; that’s simply not what these laws are about.

      Bear in mind, there are many countries where consenting adults are prosecuted for having sex the wrong way. It’s not so long ago that this was also the case in Europe, and a lot of people explicitly want that back. On the other hand, beating children has a lot of fans in the same demographic. Some people want to actually protect children, but a whole lot of people simply want to prosecute sexual minorities, and the difference shows.

      17-year-olds who exchange nude selfies are making child porn under these laws. I know there have been convictions in the US; not sure about Europe. I know that teachers have been prosecuted after minors came to them for help because their selfies were being passed around at school: the students had sent the images in question to the teacher, and that’s possession. In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.

      Anyway, what I’m saying is: We need harsher laws and more surveillance to deal with this epidemic of child porn. Only a creep would defend child porn, and I am not a creep.

      • duisgur · 26 minutes ago

        Legality is not the same as morality.

      • mindbleach · 13 hours ago

        Even people who want this sort of image banned need to stop describing it like the real thing. These headlines drive me up the wall - like seeing people arrested for “simulated murder.”

        • mindbleach · 11 hours ago

          Walk me through how any rendering “borders on” proof that a child was raped.

    • mindbleach · 13 hours ago

      There are no subjects. It’s image gloop. You cannot exploit or harm JPEG artifacts.

      Think of it as a drawing. If you’re sketching from life… yeah, jail. Otherwise what are we even talking about?

  • mindbleach · 18 hours ago

    You cannot generate CSAM. That’s the entire point of calling it CSAM!

    CSAM effectively means “photographic evidence of child rape.” If there’s no child - then that didn’t happen, did it? Fictional crimes don’t tend to be identically illegal, for obvious reasons. Even if talking about murder can rise to the level of a crime - it’s not the same crime, because nobody died.

    distribution of images of minors fully generated by artificial intelligence

    What minors? You’re describing victims who are imaginary. Zero children were involved. They don’t fucking exist.

    We have to treat AI renderings the same way we treat drawings. If you honestly think Bart Simpson porn should be illegal - fine. But say that. Say you don’t think photographs of child rape are any worse than or different from a doodle of a bright yellow dick. Say you want any depiction of the concept treated as badly as the real thing.

    Because that’s what people are doing, when they use the language of child abuse to talk about a render.

    • otp · 11 hours ago

      Some AI programs remove clothes from images of clothed people. That could cause harm to a child in the image or to their family.

      And the reason it can be called AI-generated CSAM is that the images depict something that would be CSAM if it were real. Just like we can say “CGI murder victims” or “prosthetic corpses.” Prosthetics can’t die, so they can’t produce corpses, but we can call them prosthetic corpses because they’re prosthetics made to simulate real corpses.

      I’m curious as to why you seem to be defending this so vehemently, though. You can call it AI CP if it makes you feel better.

      • Eezyville · 11 hours ago

        Photoshop can remove the clothes off a child too. Should we ban that and arrest people who use it? What about scissors and tape - you know, the old-fashioned way: take a picture of a child and put the face on someone else’s body. Should we arrest people for doing that? This is all a waste of money and resources. Go after actual abusers and save real children instead of imaginary AI-generated 1s and 0s.

      • mindbleach · 11 hours ago

        Yeah, can’t imagine why evidence of child rape would deserve special consideration. We only invented the term CSAM, as opposed to CP, specifically to clarify when it is real, versus the make-believe of ‘wow, this would be bad, if it wasn’t made up.’

        Do you think CGI of a dead body is the same as murder?

        That’d be bad if it was real! It’d be murder! But it - fucking - isn’t, because it’s not real.

        I must underline, apparently: the entire reason for using the term “child sexual abuse material” is to describe material proving a child was sexually abused. That’s kinda fucking important. Right? That’s bad in the way that a human person dying is bad. If you treated James Bond killing someone, in a movie, the same way you treated an actual human person dying, people would think you’re insane. Yet every fucking headline about these renders uses the same language as if typing words into a program somehow fucked an actual child.