• peanuts4life@lemmy.blahaj.zone

    It’s worth mentioning that in this instance the guy did send porn to a minor. This isn’t exactly a cut-and-dried “guy used stable diffusion wrong” case. He was distributing it and grooming a kid.

    The major concern to me is that there isn’t really any guidance from the FBI on what you can and can’t do, which may lead to some big issues.

    For example, websites like novelai make a business out of providing pornographic, anime-style image generation. The models they use are deliberately tuned to produce abstract, “artistic” styles, but they can generate semi-realistic images.

    Now, let’s say a criminal group uses novelai to produce CSAM of real people via the inpainting tools. Let’s say the FBI casts a wide net and begins surveillance of novelai’s userbase.

    Is every person who goes on there and types, “Loli” or “Anya from spy x family, realistic, NSFW” (that’s an underage character) going to get a letter in the mail from the FBI? I feel like it’s within the realm of possibility. What about “teen girls gone wild, NSFW?” Or “young man, no facial or body hair, naked, NSFW?”

    This is NOT a good scenario, imo. The systems used to produce harmful images are the same systems used to produce benign or borderline images. It’s a dangerous mix, and it throws the whole enterprise into question.

    • ricecake

      The major concern to me is that there isn’t really any guidance from the FBI on what you can and can’t do, which may lead to some big issues.

      https://www.ic3.gov/Media/Y2024/PSA240329
      https://www.justice.gov/criminal/criminal-ceos/citizens-guide-us-federal-law-child-pornography

      They’ve actually issued warnings and guidance, and the law itself is pretty concise regarding what’s allowed.

      (8) “child pornography” means any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, of sexually explicit conduct, where-

      (A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct;

      (B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or

      (C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.

      (11) the term “indistinguishable” used with respect to a depiction, means virtually indistinguishable, in that the depiction is such that an ordinary person viewing the depiction would conclude that the depiction is of an actual minor engaged in sexually explicit conduct. This definition does not apply to depictions that are drawings, cartoons, sculptures, or paintings depicting minors or adults.

      https://uscode.house.gov/view.xhtml?hl=false&edition=prelim&req=granuleid%3AUSC-prelim-title18-section2256&f=treesort&num=0

      If you’re going to be doing grey-area things, you should honestly do more than the five minutes of searching it took me to find those.

      It was basically born out of a Supreme Court case in the early 2000s regarding an earlier version of the law, which went much further and banned anything that “appeared to be” or “was presented as” sexual content involving minors, regardless of context. That version could plausibly have been used against young-looking adult models, artistically significant paintings, or works like Romeo and Juliet, which are neither explicit nor vulgar but could be presented as involving child sexual activity. (Juliet’s 14, and it’s clearly labeled as a love story.)
      After the relevant provisions were struck down, a new law was passed that factored in the justices’ rationale and commentary about what would be acceptable, giving us our current standard of “it has to have some redeeming value, or not involve actual children and plausibly not look like it involves actual children”.

    • mindbleach

      The part involving actual children is go-to-jail territory, because it involves actual children.

      The images don’t.

      There’s as much “child sexual abuse” in a generated image as in a crayon drawing of Bart Simpson fucking Lisa. Anyone’s visceral reaction may differ between the two JPGs, but we should treat them the same because they are the same.

    • PirateJesus@lemmy.today

      The major concern to me, is that there isn’t really any guidance from the FBI on what you can and can’t do, which may lead to some big issues.

      The PROTECT Act of 2003 means that any artistic depiction of CSAM is illegal. The guidance is pretty clear: the FBI is gonna raid your house…eventually. We still haven’t properly funded the anti-CSAM departments.

    • retrieval4558@mander.xyz

      Is every person who goes on there and types, “Loli” or “Anya from spy x family, realistic, NSFW” (that’s an underage character) going to get a letter in the mail from the FBI?

      I’ll throw that baby out with the bathwater to be honest.

      • Duamerthrax@lemmy.world

        Simulated crimes aren’t crimes. Would you arrest every couple that finds healthy ways to simulate rape fetishes? Would you arrest every person who watches The Fast and the Furious or The Godfather?

        If no one is being hurt, if no real CSAM is being fed into the model, if no pornographic images are being sent to minors, it shouldn’t be a crime. Just because it makes you uncomfortable doesn’t make it immoral.

        • Ookami38

          Or, ya know, everyone who ever wanted to decapitate those stupid fucking Skyrim children. Crime requires a damaged party, and in this case (the idealized one, not the specific one in the article) there is none.

          • DarkThoughts@fedia.io

            Those were demon children from hell (with like 2 exceptions maybe). It was a crime by Bethesda to make them invulnerable / protected by default.

        • helpImTrappedOnline@lemmy.world

          Simulated crimes aren’t crimes.

          If they were, anyone who’s played games is fucked. I’m confident everyone who has played went on a total rampage murdering the townsfolk, pillaging their houses and blowing everything up…in Minecraft.

        • Maggoty@lemmy.world

          They would though. We know they would, because conservatives already did the whole “laws about how you can have sex in private” thing.

        • PirateJesus@lemmy.today

          Simulated crimes aren’t crimes.

          Artistic CSAM is definitely a crime in the United States. PROTECT Act of 2003.

          • Duamerthrax@lemmy.world

            People have only gotten in trouble for that when they’re already in trouble for real CSAM. I’m not terribly interested in sticking up for actual CSAM scum.

          • mindbleach

            Directly contradicted by the Ashcroft decision two years prior.

            • Meansalladknifehands@lemm.ee

              For now. If you read the article, it states that he shared the pictures to form like-minded groups where they got emboldened, could support each other, and could legitimize/normalize their perverted thoughts. How about no thanks.

              • Lowlee Kun@feddit.de

                Wrong comment chain. People weren’t talking about the criminal shithead the article is about, but about the scenario of someone using (non-CSAM-trained) models to create questionable content (the implication being that there would be no victim). We all know that there are bad actors out there, just like there are rapists and murderers. Still, we don’t condemn true-crime lovers or rape fetishists until they commit a crime. We could do the same with pedos, but somehow we believe hating them into the shadows will stop them from doing criminal stuff?

                • Meansalladknifehands@lemm.ee

                  And I’m using the article as an example that it doesn’t just stop at “victimless” images, because these are not fucking normal people. They are mentally sick; they are sexually turned on by the abuse of a minor, not by the minor but by abusing the minor, sexually.

                  In what world would a person like that stop at looking at images? They actively search for victims and create groups where they share and discuss abusing minors.

                  Yes dude, they are fucking dangerous, bro. Life is not fair. You wouldn’t say the same shit if someone close to you was a victim.

              • Duamerthrax@lemmy.world

                Maybe you should focus your energy on normalized things that actually affect kids, like banning full-contact sports that cause CTE.

                • Meansalladknifehands@lemm.ee

                  What do you mean, focus my energy? How much energy do you think I spend discussing perverts? And why should I spend my time discussing contact sports? It sounds like you are deflecting.

                  Pedophiles get turned on by abusing minors; they are mentally sick. It’s not like it’s a normal sexual desire, and they will never stop at watching “victimless” images. Fuck pedophiles, they don’t deserve shit, and I hope they eat shit the rest of their lives.

                  • Duamerthrax@lemmy.world

                    they will never stop at watching “victimless” images.

                    How is that different from any other dangerous fetish? Should we be arresting adult couples that do Age Play? All the BDSM communities? Do we even want to bring up the Vore art communities? Victimless is victimless.

                  • Wes4Humanity@lemm.ee

                    You’re correct, pedophilia is a mental illness. A very tragic one, since there is no hope and no cure. They can’t even ask for help, because everyone will automatically assume they are also child molesters. That’s what you’re describing, but not all child molesters are pedophiles, and most pedophiles will never become child molesters… Like you said, some people just get off on exploiting the power dynamic and aren’t necessarily sexually attracted to children. Those people are the real danger.

            • PotatoKat@lemmy.world

              Real children are in the training data regardless of whether there is CSAM in it (and there’s a high chance there is, considering how they source their training data), so real children are involved.

              • Duamerthrax@lemmy.world

                I’ve already stated that I do not support using images of real children in the models. Even if the images are safe/legal, it’s a violation of privacy.

          • Maggoty@lemmy.world

            Nobody is arguing that it’s moral. That’s not the line for government intervention. If it were, the entire private banking system would be in prison.