• sugar_in_your_tea · 2 days ago

    At that point you have an actual victim and evidence of harm. If the image depicts an actual person, you run into a ton of other laws that punish such things.

    But if the child doesn’t actually exist, who exactly is the victim?

    Yeah, it would be CSAM if it were real. But it’s not, so it’s not CSAM, therefore no victim.

    • otp · 2 days ago

      I replied to another comment with a definition from a definitive source. Computer-generated CSAM is the preferred term. Call it CSEM if you prefer. (E = exploitation)

      CSAM/CSEM refers to images/material depicting the sexual Abuse/Exploitation of a child.

      AI-generated CSAM means the AI produced images that depict sexual exploitation of children.

      You can ask the model to generate murder scenes. You then have AI-generated images of murder scenes. That doesn’t mean anybody was killed. It’s not illegal to own images of murder scenes, but it is often illegal to own CSEM.

      Whether the CSEM being AI-generated is enough to protect you in the eyes of the law against charges of possessing CSEM is something to take up with a legal professional in your particular jurisdiction.

      • sugar_in_your_tea · 2 days ago

        Whether the CSEM being AI-generated is enough to protect you in the eyes of the law against charges of possessing CSEM is something to take up with a legal professional in your particular jurisdiction.

        And that’s where I take issue: it shouldn’t be possible to prosecute someone when there is no victim.

        That doesn’t change the law, so your advice here is sound. But if I were put on a jury for a case like this, I would vote to nullify.