Police in 20 countries have taken down a network that distributed images of child sexual abuse entirely generated by artificial intelligence.

The operation – which spanned European countries including the United Kingdom, as well as Canada, Australia and New Zealand – is “one of the first cases” involving AI-generated child sexual abuse material, Europe’s law enforcement agency Europol, which supported the action, said in a press release.

Danish authorities led the operation, which resulted in 25 arrests, 33 house searches and the seizure of 173 devices.

  • mindbleach · 2 days ago

    You can’t… generate… abuse.

    No more than you can generate murder.

    The entire point of saying “child abuse images” is to distinguish evidence of rape from, just, drawings.

    If you want drawings of this to also be illegal, fine, great, say that. But stop letting people use the language of actual real-world molestation of living human children, when describing some shit a guy made up alone.

    • Q*Bert Reynolds · 2 days ago

      How did they train the model? I’d say it’s just as problematic if the generator was trained using CSAM.

      • mindbleach · 2 days ago

        How can anyone believe these models have a big pile of hyper-illegal go-to-jail images, labeled specifically for word-to-image training?

        This technology combines concepts. That’s why it’s a big fucking deal. Generating a thousand images of Darth Shrektopus riding a horse on the moon does not imply a single example of exactly that. The model knows what children are - the model knows what pornography is. It can satisfy those unrelated concepts, simultaneously.

          • mindbleach · 2 days ago

            The best punchline would be if you manually Photoshopped this together circa 2012.

            And the only clear thing suggesting it’s not that is the middle-left tentacle.

        • Q*Bert Reynolds · 2 days ago

          Because that’s exactly what happens. The vast majority of AI porn isn’t coming from unmodified models. People use LoRAs and their massive porn collections to retrain and fine-tune models to generate their exact fetish. Pedos are definitely doing that too.

      • jol@discuss.tchncs.de · 1 day ago

        Somewhat agree, but image models are able to combine concepts to generate semi-novel images. You wouldn’t need child porn specifically to generate child porn. Images of children and images of porn can be combined to create child porn.

        But considering how indiscriminate AI crawlers have been, you can bet the training data does contain lots of it…

      • catloaf@lemm.ee · 2 days ago

        Theoretically you should be able to generate it by cobbling together legal images.

        But given the massive volume of scraped data, they’ve also ended up with actual CSAM in their training data. I recall seeing articles about them having to identify and remove it, but I’m definitely not adding that search to my history.

        • Q*Bert Reynolds · 2 days ago

          I’m not even talking about the accidentally scraped images. People retrain models to make porn that more accurately depicts their fetish all the time.

        • Bob Robertson IX @discuss.tchncs.de · 2 days ago

          I find the legal aspect of it fascinating, because proving the age of a computer-generated boob is impossible unless you have access to the actual prompt used to create it, and that prompt specifically asks for an image of CSAM. And even then, who’s to say the system didn’t create most of the image from legal training data and just add underage facial features? It’ll be interesting to see how they prosecute and defend against these charges.