cross-posted from: https://lemmy.zip/post/15863526

Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison

      • Jimmyeatsausage@lemmy.world

        It’s not child sexual assault if there was no abuse. However, the legal definition of CSAM is any visual depiction, including computer or computer-generated images of sexually explicit conduct, where […]— (A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct; (B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or (C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.

        You may not agree with that definition, but even simulated images that look like kids engaging in sexual activity meet the threshold for CSAM.

        • JackGreenEarth@lemm.ee

          Do you not know that CSAM is an acronym that stands for child sexual abuse material?

          • Possibly linux@lemmy.zip

            True, but CSAM is anything that involves minors. It’s really up to the court to decide a lot of it, but in the case above I’d imagine the images were quite disturbing.

          • Possibly linux@lemmy.zip

            I think the court looked at the psychological aspects of it. When you look at that kind of material, you are training your brain and body to be attracted to that stuff in real life.

    • Reddfugee42@lemmy.world

      We’re discussing the underpinnings and philosophy of the legality, and your comment is simply “it is illegal.”

      I can only draw from this that your morality is based on laws instead of vice versa.

      • Possibly linux@lemmy.zip

        I’m in the camp that there is no reason you should have that kind of imagery, especially AI-generated imagery. Think about what people often do with pornography. You do not want them doing that with children, regardless of whether it is AI-generated.

        • Reddfugee42@lemmy.world

          What does “want” have to do with it? I’d rather trust science and psychologists to determine whether this, which is objectively harmless, helps them control their feelings and gives them a harmless outlet.

          • Possibly linux@lemmy.zip

            They aren’t banning porn in general. They just don’t want to create any more sexual desire toward children. The CSAM laws came from child protection experts. Admittedly, some of these people want to “ban” encryption, but that’s irrelevant in this case.

    • mindbleach

      There was no C.

      There was no SA.

      The entire point of saying “CSAM” was to distinguish evidence of child rape from depictions of imaginary events.