• ricecake · 7 months ago

    In many ways they are. The image generated from a prompt isn’t unique, and is actually semi-random; it’s not entirely in the user’s control. The person could argue “I described what I like, but I wasn’t asking it for children, and I didn’t think they were fake images of children,” and based purely on the image it could be difficult to argue that the image is not only “child-like” but actually depicts a child.

    The prompt, however, very directly shows what the user was asking for in unambiguous terms, and the negative prompt removes any doubt that they thought they were getting depictions of adults.

    • PirateJesus@lemmy.today · 7 months ago

      And also it’s an AI.

      13k images before AI involved a human with Photoshop, or a child being made to do fucked up shit.

      13k images after AI is just forgetting to turn off the CSAM auto-generate button.

    • mindbleach · 7 months ago

      To which we have to ask:

      And?

      You can ask for images of murder, and be very specific that you want murder, and that’s not the same thing as killing someone.

      You can ask for pornography involving blood and guts. No human person will be harmed by it.

      You can ask for the most misogynistic, racist, outright Nazi bullshit, and it’s still a fantasy of linear algebra.

      Why do we give a shit about depictions? Depictions are not real.

      • ricecake · 7 months ago

        And that’s a valid viewpoint.

        Many people argue, and I tend to agree, that the availability of any depiction serves both to increase demand and to muddy the waters, making it harder to distinguish photographic depictions from generated ones.
        Outlawing generated depictions alongside photographic ones keeps the supply low, keeps that form of content clearly in the “do not seek” category, and prevents attempts to camouflage or launder photographic depictions through generation systems.
        That’s why the law is written to cover not only actual photographs, but any sufficiently detailed representation as well.

        • mindbleach · 7 months ago

          Do you then burn Nabokov’s Lolita? Or the million overtly pornographic stories online, merely describing these acts? Any stick-figure drawing with the names of fictional characters could “increase demand” by communicating the idea. If that demand can be satisfied with… more stick-figure drawings… I’m not sure why we should give a shit.

          The first amendment doesn’t make exceptions for expression that is too good. If someone paints a picture that approaches photorealism, there’s not some cutoff where we treat it like a photograph, regardless of what it’s a painting of. It can never be the same thing. It is unreasonable for a government to pretend otherwise. Even if the painting violates the law in some other way, as a threat or defamation or whatever, we still don’t consider it evidence of the event depicted, because the event did not happen.

          This is vegan leather. This is an approximation of something that’s theoretically violent and disgusting, but was produced without harming any living creature. It is an ethical counterfeit.

          • ricecake · 7 months ago

            You’re correct that the first amendment doesn’t have a quality exception, but it does have exceptions in general.
            While first amendment protections are very important, critically so, there is also a real interest in preventing or discouraging child abuse.

            https://www.ic3.gov/Media/Y2024/PSA240329
            https://www.justice.gov/criminal/criminal-ceos/citizens-guide-us-federal-law-child-pornography
            https://supreme.justia.com/cases/federal/us/535/234/

            These issues have been discussed by the courts before, and that’s how we came to the current line: the material has to be a visual depiction, virtually indistinguishable from a real child engaged in sexually explicit conduct, with no redeeming artistic or cultural value, the policy clarified by the FBI in the first link.
            In the Supreme Court case linked, the opinion (which overturned an earlier, overbroad law) raised many of the points you do, and those were its reasons for finding the way it did.
            But the court also conceded that the government has a compelling interest in suppressing child abuse, and that a sufficiently narrow restriction would be permissible under the first amendment.

            So for your hypotheticals, the answer would be “no”, because they either have redeeming artistic value, are trivially distinguished from an actual photograph, aren’t visual depictions, or all of the above.
            You’re right that there’s no cutoff where a painting becomes treated like a photograph, but they can and did define a line for images where we can “treat it like a photograph”:

            the term “indistinguishable” used with respect to a depiction, means virtually indistinguishable, in that the depiction is such that an ordinary person viewing the depiction would conclude that the depiction is of an actual minor engaged in sexually explicit conduct. This definition does not apply to depictions that are drawings, cartoons, sculptures, or paintings depicting minors or adults.

            • mindbleach · 7 months ago

              that’s how we came to the line

              … is it, though?

              The line only exists because puritan busybodies banned pornography entirely. Their arguments were a mix of dogma and making shit up. So, two kinds of making shit up.

              It’s been trimmed back by judges admitting we’re not supposed to do that, but making excuses to keep doing it anyway. The working standard after Roth was literally “I know it when I see it” (Stewart’s concurrence in Jacobellis). The Miller standard just pinned down a pretense for that sentiment.

              Even the Ferber decision, specifically concerning actual CSAM, was about obscenity instead of focusing on the abuse. It took until Osborne for the court to really go ‘oh, right, the kids.’

              This whole area of American law is deeply gross because we’re coming at it ass-backwards. Free speech does not need a purpose! Free speech does not need value! It is innately irrepressible, until it causes direct harm. Offense is not harm. I can talk shit in the most G-rated way and make people understandably think about strangling me. Or I can list every swear word and slur that I know before inventing some of my own. Both are protected - or goddamn well ought to be. I can get booted off a website or told to leave the drive-through, but the government’s not gonna come arrest me for mere frothing bigotry, however objectively value-less that ranting and raving might be.

              The Ashcroft standard still cares equally about obscenity and the existence of actual victims. Despite obscenity being complete horseshit. And even then, the upheld decision pointed out, speech is protected unless it causes direct harm. As Wikipedia neatly puts it: the government could not prohibit speech “merely because of its tendency to persuade its viewers to engage in illegal activity.”

              Now the current standard is… you’re not allowed to draw too good or render too fancy… because it might make the government’s job harder? We’re gonna look back on that the same way we look back on puritanical nonsense about vulgar stories turning men into insatiable beasts. The battle is half-over thanks to clarifying CSAM versus CP. No C? No SA. The other half is reminding people that fictional violence doesn’t become “interest in real violence” and normal pornography has yet to ruin society.