Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child sexual abuse images from photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, GMP said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

  • Cryophilia@lemmy.world · 2 months ago

    Most AI generated images are not of real, identifiable people. I agree that deepfake porn is bad, whether of a child or adult, but that’s a separate category.

    • otp · 2 months ago

      You’re definitely right, and I’m aware. The smaller the sample size, though, the more likely it is that an AI art generator will create something that looks very similar to a given individual.

      As well, some AI art generators accept prompt images to use as a starting point.

      • Cryophilia@lemmy.world · 2 months ago

        Ok, but my point is that’s a pretty niche thing to be worried about. You can’t apply that broadly to all AI porn.

        • otp · 2 months ago

          Not so much when it comes to prompt images.