• mindbleach · 1 year ago

    There is no such thing.

    God dammit, the entire point of calling it CSAM is to distinguish photographic evidence of child rape from made-up images that make people feel icky.

    If you want them treated the same, legally - go nuts. Have that argument. But stop treating the two as the same thing, and fucking up clear discussion of the worst thing on the internet.

    You can’t generate assault. It is impossible to abuse children who do not exist.

    • m0darn@lemmy.ca · 1 year ago

      Did nobody in this comment section watch the video at all?

      The only case mentioned in the video is one where high school students distributed (counterfeit) sexually explicit images of their classmates that had been generated by an AI model.

      I don’t know whether it meets the definition of CSAM, because the events depicted in the images are fictional but the subjects are real.

      These children do exist, and some have doubtless been traumatized by this. This crime has victims.

    • rurutheguru@lemmings.world · 1 year ago

      I think a lot of people are arguing that the models used to generate this kind of content are trained on literal CSAM. So it’s like CSAM with extra steps.

    • crispy_kilt@feddit.de · 1 year ago

      In most (all?) countries, no such distinction is made; the material is illegal all the same.