• otp · 2 days ago

    The term “computer (or digitally) generated child sexual abuse material” encompasses all forms of material representing children involved in sexual activities and/or in a sexualised manner, with the particularity that the production of the material does not involve actual contact abuse of real children but is artificially created to appear as if real children were depicted. It includes what is sometimes referred to as “virtual child pornography” as well as “pseudo photographs”.

    […]

    There is nothing “virtual” or unreal in the sexualisation of children, and these terms risk undermining the harm that children can suffer from these types of practices or the effect material such as this can have on the cognitive distortions of offenders or potential offenders. Therefore, terms such as “computer-generated child sexual abuse material” appear better suited to this phenomenon [than virtual child pornography].

    • Terminology Guidelines for the Protection of Children from Sexual Exploitation and Sexual Abuse, section F.4.ii

    There’s a reputable source for the terminology usage.

    If you want to keep defending CG CSAM, take it up with the professionals.

    • mindbleach · 2 days ago

      “The professionals are also full of shit” is not much of an argument.

      • otp · 2 days ago

        I’m going to hold the words of the people who are actually fighting against child exploitation in much higher regard than someone who is defending AI-generated CSAM/CSEM. And honestly, I don’t understand why you’re defending it. It’s weirding me out…lol

        As I wrote in another comment,

        You can ask the model to generate murder scenes. You then have AI-generated images of murder scenes. That doesn’t mean anybody was killed.

        That’s all this is.

        • mindbleach · 2 days ago (edited)

          That doesn’t mean anybody was killed.

          Or that any child was “exploited.”

          I’m fucking disappointed that anyone professionally engaged in this wants to equate damning evidence of physical abuse with generic representation of the concept - for the exact reasons already described.

          There is an insurmountable difference between any depiction of a sex crime involving fictional children - and the actual sexual abuse of real living children. Fuck entirely off about throwing aspersions for why this distinction matters. If you don’t think child rape is fundamentally worse than some imagined depiction of same - fuck you.

          • otp · 2 days ago

            equate damning evidence of physical abuse with generic representation

            That’s not what it is.

            Just like AI-generated murder scenes are not being equated to physical evidence of someone having been murdered.

            I think you’re getting caught up in semantics. Can we at least agree that those AI-generated images are bad?

            • mindbleach · 2 days ago (edited)

              “Generated sexual abuse” is explicitly being equated to actual child rape.

              These 25 people were not charged with thinkin’ real hard about the possibility of murder. The sting is described like they were caught doing some murder.

              • otp · 1 day ago

                The 25 people charged may have had other incriminating evidence against them.

                If you take issue with the law, take it up with the jurisdictions.

                If you think it should be perfectly okay for people to produce AI-generated CSEM, then I’m not really sure we can come to an agreement here.