Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child abuse images using photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, GMP said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

  • otp · 24 days ago

    Would it harm you to have identifiable nude photos of you available for download on the internet?

    Would it harm you to have identifiable nude photos of you being used to train AI so that it can create more nude images that are “inspired” by your nude images?

    Would you be happy to upload your children’s nude photos so that people on the internet can share them and masturbate to them? Would you be harmed if your parents had done that with your images?

    • Cryophilia@lemmy.world · 23 days ago

      Most AI generated images are not of real, identifiable people. I agree that deepfake porn is bad, whether of a child or adult, but that’s a separate category.

      • otp · 22 days ago

        You’re definitely right, and I’m aware. The smaller the sample size, though, the more likely an AI art generator is to create something that looks very similar to a given individual.

        As well, some AI art generators accept prompt images to use as a starting point.

        • Cryophilia@lemmy.world · 22 days ago

          OK, but my point is that that’s a pretty niche thing to be worried about. You can’t apply it broadly to all AI porn.

          • otp · 22 days ago

            It’s not so niche when it comes to prompt images.

    • Mango@lemmy.world · 24 days ago

      As a child? No. In fact, I can milk that for pity money. As an adult, I can’t see how it matters. I don’t like it, but it doesn’t hurt me any.

      Also definitely no.

      Again, double no.

      • otp · edited · 24 days ago

        To clarify, the second-to-last question, about your children, was “would you be happy to …”

        If you wouldn’t be happy to, then why not?

        And if you would be happy to do that, then why? Lol

        • Mango@lemmy.world · 24 days ago

          You got me there. It’s definitely weird and gross, and therefore no. That’s harm enough, but that’s more a matter of it being published and real. This dude doing it for himself is, to me, hardly different from fantasizing in your head or drawing in your sketchbook. That said, what was his AI training material? He was also doing this for other people and encouraging rape and the like.

          • otp · 23 days ago

            What makes it different from imagining it or drawing it is that the AI uses real photos as training material. If the parents are knowingly providing images, that’s questionable. If the AI is trained on actual CSAM images, that’s horrible. If it’s using non-CSAM images of children without the knowing consent of the parents, that’s pretty bad too.

            • Mango@lemmy.world · 23 days ago

              How is AI using real photos any different from a person using their real memory?

              • otp · 23 days ago

                Because the AI publishes what it creates based on those images, and it doesn’t have imagination the way a person does. It could accidentally create CSAM depicting a child who looks exactly like someone’s real child, and it can generate images that look like photographs. Someone sketching from memory can’t do that.

                • Mango@lemmy.world · 23 days ago

                  AI doesn’t have to publish anything, and that doesn’t make it any different from drawing. I also don’t think the CP is accidental. And someone with enough skill can absolutely draw something photorealistic.

                  • otp · edited · 23 days ago

                    Sorry, I meant it could create CSAM that, by accident, looks exactly like one of the children in its source images.

                    The AI “publishes” whenever it hands something to the user.

                    And drawing is different from AI art because AI art can look like a photograph.