• Lowlee Kun
    19 · 1 month ago

    Can’t generate Abuse Material without Abuse. Generative AI does not need any indecent training data to be able to produce indecent material.

    But it is a nice story to shock and scare people, so I guess the goal has been reached.

    • @mindbleach
      -1 · 1 month ago

      Yes! “Generated CSAM” is a CONTRADICTION. The entire god-damn point of calling it CSAM is to clarify: it is photographic evidence of sexual assault on children. No assault occurs when Stable Diffusion spits out images that make you feel icky. It’s a drawing. Zero children were involved. It is a fantasy of linear algebra.

      You cannot abuse children who do not exist.

      We’re talking about a near future where everyone has AI generator nonsense thrust upon them by Microsoft themselves, and entering “woman naked 18” will instantly drown you in bespoke no-questions-asked pornography, but mistyping “woman naked 17” will be treated like you raped a toddler.

      And a network that can generate a wizard raising zombies from a swamp made of ice cream fucking obviously does not need real photos of that exact thing. This tech mashes concepts together. Any data set’s gonna have children. Any data set’s gonna have nudity. It can combine those as readily as anything else. It has no moral opinion. That’s how the world got photorealistic pornography of Pokemon.