A sex offender convicted of making more than 1,000 indecent images of children has been banned from using any “AI creating tools” for the next five years in the first known case of its kind.

Anthony Dover, 48, was ordered by a UK court “not to use, visit or access” artificial intelligence generation tools without the prior permission of police as a condition of a sexual harm prevention order imposed in February.

The ban prohibits him from using tools such as text-to-image generators, which can make lifelike pictures based on a written command, and “nudifying” websites used to make explicit “deepfakes”.

Dover, who was given a community order and £200 fine, has also been explicitly ordered not to use Stable Diffusion software, which has reportedly been exploited by paedophiles to create hyper-realistic child sexual abuse material, according to records from a sentencing hearing at Poole magistrates court.

      • Scratch · 7 months ago

        This is pretty similar to restraining orders: make it more difficult and make the consequences more severe.

      • xmunk · 7 months ago

        In the modern world, when we have cellphones that can do pretty much anything… it’s fucking hard. There will be a parole officer and monitoring software, with periodic physical inspections, along with watching his purchases. (That’s, at least, the American approach.)

        Usually the way it works is that when this dude slips up once, he goes to prison for violating his court order.

      • bobs_monkey@lemm.ee · 7 months ago

        Or a burner laptop/Chromebook/whatever. Couple that with a VPN, a neighbor’s wifi, public hotspots, etc., and I don’t really see how they can realistically enforce this against someone motivated to do what they’re gonna do.