Europol has supported authorities from 19 countries in a large-scale hit against child sexual exploitation that has led to 25 arrests worldwide. The suspects were part of a criminal group whose members were engaged in the distribution of images of minors fully generated by artificial intelligence (AI).
I replied to another comment with a definition from an authoritative source. Computer-generated CSAM is the preferred term. Call it CSEM if you prefer. (E = exploitation)
CSAM/CSEM refers to images/material depicting the sexual Abuse/Exploitation of a child.
AI-generated CSAM means the AI produced images that depict sexual exploitation of children.
You can ask a model to generate murder scenes, and you then have AI-generated images of murder scenes. That doesn’t mean anybody was killed. It’s not illegal to own images of murder scenes, but it’s often illegal to possess CSEM.
Whether the CSEM being AI-generated is enough to protect you from possession charges under the law is something to take up with a legal professional in your particular jurisdiction.
And that’s where I take issue. It shouldn’t be possible to prosecute someone when there is no victim.
That doesn’t change the law, so your advice to consult a lawyer is sound. But if I’m put on a jury in a case like this, I would vote to nullify.