I'm afraid Europol is shooting itself in the foot here.
What should be done is better ways to mark and identify AI-generated content, not a carpet ban and criminalization.
Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as ongoing investigations suggest, there's no shortage of supply or demand on that front. If every option is equally illegal, and people are going to seek some outlet anyway, escalation becomes easier - and that's dangerous.
As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.
What would stop someone from creating a tool that tagged real images as AI generated?
Have at it with drawings that are easily distinguished, but if anything is photorealistic I feel like it needs to be treated as real.
Some form of digital signatures for allowed services?
Sure, it will limit the choice of where to legally generate content, but it should work.
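Roughly what that might look like, as a minimal sketch: an approved generator signs every image it outputs, and anyone can later check the signature against the service's published public key. The names here (`sign_image`, `verify_image`) and the choice of Ed25519 via Python's `cryptography` package are just illustrative assumptions; real provenance schemes (e.g. C2PA) embed the signature in the file's metadata and are considerably more involved.

```python
# Hypothetical sketch: an "allowed service" signs its generated images,
# and a verifier checks the signature using the service's public key.
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey,
)
from cryptography.exceptions import InvalidSignature

# The service holds the private key; only the public key is published.
service_key = Ed25519PrivateKey.generate()
published_public_key = service_key.public_key()

def sign_image(private_key: Ed25519PrivateKey, image_bytes: bytes) -> bytes:
    """Service side: produce a signature over the generated image bytes."""
    return private_key.sign(image_bytes)

def verify_image(public_key: Ed25519PublicKey, image_bytes: bytes,
                 signature: bytes) -> bool:
    """Verifier side: True only if this exact image was signed by the service."""
    try:
        public_key.verify(signature, image_bytes)
        return True
    except InvalidSignature:
        return False

image = b"...generated image bytes..."
sig = sign_image(service_key, image)
print(verify_image(published_public_key, image, sig))         # True
print(verify_image(published_public_key, image + b"x", sig))  # False: altered image
```

Note that this only proves "this exact file came from that service" - it does nothing to stop someone from stripping the signature, or from signing whatever they like with their own key, which is the tagging problem raised above.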
I highly doubt any commercially available service is going to get in on officially generating photorealistic CSAM.
Open-source models exist and can be forked
This relies on the idea that an "outlet" is not harmful. It might even be encouraging, but who do you think would ever study this to help us know? Can you imagine the scientists who'd have to lead studies like this - an incredibly grim and difficult subject, with a high likelihood that no one would listen to you anyway.
IIRC there was actually a study and pedos with access to synthetic CSAM were less likely to victimize real children.
You can download the models and run them yourself; trying to ban that will be about as effective as the US government was at banning encryption.
I hope they don't have access to a cloud computing provider somewhere, otherwise this is going to be a tough thing to enforce without a great firewall bigger than the one China has.
It will be hilarious to see them attempt it though.
Sadly it seems like most of Europe and potentially other “western” countries will follow
I haven’t read any of this research because, like, the only feelings I have about pedophiles are outright contempt and a small amount of pity for the whole fucking destructive evilness of it all, but I’ve been told having access to drawings and images and whatnot makes people more likely to act on their impulses.
And like. I don't think images of CSAM in any form, no matter how far removed they are from real people, actually contribute anything worthwhile at all to the world, so like. I dunno.
Really couldn't give two squirts of piss about anything that makes a pedophile's life harder. Human garbage.
As an advocate for the online and offline safety of children, I did read into the research. None of the research I've found confirms, with any sort of evidence, that AI-generated CSAM increases the risk of other illicit behavior. We need more evidence, and I do recommend exercising caution with statements, but for the time being we can rely on studies of other forms of illegal behavior and the effects of their decriminalization, which paint a fairly positive picture. Generally, people tend to opt for what is legal and more readily accessible - and we can make AI CSAM exactly that.
For now, people are being criminalized for a crime with zero evidence that it's even harmful, while I tend to look quite positively on what it could bring to the table instead.
Also, pedophiles are not human trash, and this line of thinking is also harmful, making more of them hide and never get adequate help from a therapist, increasing their chances of offending. Which, well, harms children.
They are regular people who, involuntarily, have their sexuality warped in a way that includes children. They never chose it, they cannot do anything about the attraction itself, and can only decide what to do with it going forward. You could be one, I could be one. What matters is the decisions they make based on their sexuality. The correct path is celibacy and refusal of any source of direct harm towards children, including the consumption of real CSAM. This might be hard on many, and to aid them, we can provide fictional materials so they can let off some steam. Otherwise, many are likely to turn to real CSAM as a source of satisfaction, or even turn to actually abusing children IRL.
I totally agree with these guys being arrested. I want to get that out of the way first.
But what crime did they commit? They didn't abuse children… the children are AI-generated and do not exist. What they did is obviously disgusting and makes me want to punch them in the face repeatedly until it's flat, but where's the line here? If they draw pictures of non-existent children, is that also a crime?
Does that open artists up to interpretation of the law when it comes to art? Can they be put in prison because they did a professional painting of a child? Like, what if they did a painting of their own child in the bath or something? Sure, the content's questionable, but it's not exactly predatory. And if you add safeguards for these people, couldn't predators then just claim artistic expression?
It just seems entirely unenforceable and an entire goddamn can of worms…
First off I'll say this topic is very nuanced. And as sick as any child porn is, I completely agree. This, in my gut, feels like a weird slippery slope that will somehow get used against any AI-generated images or possibly any AI-generated content. It makes me feel like those "online child protection" bills that seem on the surface like not-terrible ideas, but when you start thinking about them in detail are horrific, dystopian ideas.
I actually do not agree with them being arrested.
While I recognize the issue of identification posed in the article, I hold a strong opinion it should be tackled in another way.
AI-generated CSAM might be a powerful tool to reduce demand for the content featuring real children. If we leave it legal to watch and produce, and keep the actual materials illegal, we can make more pedophiles turn to what is less harmful and impactful - a computer-generated image that was produced with no children being harmed.
By introducing actions against AI-generated materials, they make such materials as illegal as the real thing, and there's one less reason for an interested party to stay away from a CSAM site where actual children are being abused - perpetuating the cycle and leading to more real-world victims.
Nah, the argument that this could grow a "pedophile culture" and even encourage real offending is really not that far-fetched and could even be true. Without very convincing studies, do you take a chance when real kids could soon suffer? And I mean the studies would have to be really convincing.
The thing is, banning is also a consequential action.
And based on what we know about similar behaviors, having an outlet is likely to be good.
Here, the EU takes an approach of “banning just in case” while also ignoring the potential implications of such bans.
It's strange to me that it is referred to as CSAM. No people are involved, so no one is being sexually assaulted. It's creepy, but calling it that implies to me that a drawing is a person.
Exactly, which is why I’m against your first line, I don’t want them arrested specifically because of artistic expression. I think they’re absolutely disgusting and should stop, but they’re not harming anyone so they shouldn’t go to jail.
In my opinion, you should only go to jail if there’s an actual victim. Who exactly is the victim here?
It obviously depends on where they live and/or committed the crimes. But most countries have broad laws against anything, real or fake, that depicts CSAM.
It's partly because, as technology gets better, it would be easy for offenders to claim anything they've been caught with is AI-created.
It’s also because there’s a belief that AI generated CSAM encourages real child abuse.
I shan’t say whether it does - I tend to believe so but haven’t seen data to prove me right or wrong.
Also, in the end, I think it's simply an ethical position.
If an underage AI character is portrayed in, say, a movie or a game, is that wrong? Seems like a very slippery slope.
There have been controversies about that sort of thing.
I know the Oscar-winning movie The Tin Drum as an example. The book by Günter Grass is a very serious, highly celebrated piece of German post-war literature. It takes place around WW2. The protagonist has the mind of an adult in the body of a child. I guess the idea is that he is the other way around from most people?
The movie was banned in Ontario and Oklahoma, for a time. https://en.wikipedia.org/wiki/The_Tin_Drum_(film)#Censorship
With European societies shifting right, I doubt such a movie could be made today, but we aren’t at a point where it would be outright illegal.
Good to have data points as references to at least guide the discussion.
Even in cases when the content is fully artificial and there is no real victim depicted, such as Operation Cumberland, AI-generated CSAM still contributes to the objectification and sexualisation of children.
I get how fucking creepy and downright sickening this all feels, but I’m genuinely surprised that it’s illegal or criminal if there’s no actual children involved.
It mentions sexual extortion, and that's definitely something that should be illegal, same for spreading AI-generated explicit stuff about real people without their consent, involving children or adults, but idk about the case mentioned here.
It’s certainly creepy and disgusting
It also seems like we’re half a step away from thought police regulating any thought or expression a person has that those in power do not like
Exactly. If there’s no victim, there’s no crime.
It would depend on the country. In the UK even drawn depictions are illegal. I assume it has to at least be realistic and stick figures don’t count.
It sounds like a very iffy thing to police. Since drawn stuff doesn’t have actual age, how do you determine it? Looks? Wouldn’t be great.
Imagine having to argue to a jury that a wolf-human hybrid with bright neon fur is underage because it isn’t similar enough to a wolf for dog years to apply.
I mean, that's the same thing with AI-generated content. It's all trained on a wide range of real people; how do you know what's generated isn't depicting an underage person? That's why laws like this are really dangerous.
Exactly. Any time there’s subjectivity, it’s ripe for abuse.
The law should punish:
- creating images of actual underage people
- creating images of actual non-consenting people of legal age
- knowingly distributing one of the above
Each of those has a clearly identifiable victim. Creating a new work of a fictitious person doesn’t have any clearly identifiable victim.
Don’t make laws to make prosecution easier, make laws to protect actual people from becoming victims or at least punish those who victimize others.
Not going to read the article, but I will say that I understand making hyper-realistic fictional CP illegal, because it would make limiting actual CP impossible.
As long as it's clearly fictional though, let people get off to whatever imaginary stuff they want to. We might find it disgusting, but there are plenty of sexual genres that most people would find disgusting yet shouldn't be illegal.
The only way to generate something like that is to teach it something like that from real images.
only way
That’s just not true.
That said, there’s a decent chance that existing models use real images, and that is what we should be fighting against. The user of a model has plausible deniability because there’s a good chance they don’t understand how they work, but the creators of the model should absolutely know where they’re getting the source data from.
Prove that the models use illegal material and go after the model creators for that, because that’s an actual crime. Don’t go after people using the models who are providing alternatives to abusive material.
I think all of them are unethical, and yes, any service offering this should be shut down.
I never said prosecute the users.
I said you can't make it ethically, because at some point, someone is using/creating original art and the odds of human exploitation at some point in the chain are just too high.
the odds of human exploitation at some point in the chain are just too high
We don’t punish people based on odds. At least in the US, the standard is that they’re guilty “beyond a reasonable doubt.” As in, there’s virtually no possibility that they didn’t commit the crime. If there’s a 90% chance someone is guilty, but a 10% chance they’re completely innocent, most would agree that there’s reasonable doubt, so they shouldn’t be convicted.
If you can’t prove that they made it unethically, and there are methods to make it ethically, then you have reasonable doubt. All the defense needs to do is demonstrate one such method of producing it ethically, and that creates reasonable doubt.
Services should only be shut down if they’re doing something illegal. Prove that the images are generated using CSAM as source material and then shut down any service that refuses to remove it, or who can be proved as knowing “beyond a reasonable doubt” that they were committing a crime. That’s how the law works, you only punish people you can prove “beyond a reasonable doubt” were committing a crime.
How can it be made ethically?
That’s my point.
It can’t.
Some human has to sit and make many, many, many models of genitals to produce an artificial one.
And that, IMO is not ethically possible.
How can it be made ethically?
Let’s say you manually edit a bunch of legal pictures and feed that into a model to generate new images. Or maybe you pull some legal images from other regions (e.g. topless children), and label some young-looking adults as children for the rest.
I don’t know, I’m not an expert. But just because I don’t know of something doesn’t mean it doesn’t exist, it means I need to consult experts.
It can’t.
Then prove it. That’s how things are done in courts of law. Each side provides experts to try to convince the judge/jury that something did or did not happen.
My point is merely that an image that looks like CSAM is only CSAM if it actually involves abuse of a child. It’s not CSAM if it’s generated some other way, such as hand-drawing (e.g. hentai) or a model that doesn’t use CSAM in its training data.
You can't prove a negative. That's not how proving things works.
You also assume legal images. But that puts limits on what's actually legal globally. What if someone wants a 5-year-old? How are there legal photos of that?
You assume it can, prove that it can.
I don’t think this is actually true. Pretty sure if you feed it naked adults and clothed children it can figure out the rest.
That’s not how these image generators work.
How would it know what an age-appropriate penis looks like without, you know, seeing one?
That’s exactly how they work. According to many articles I’ve seen in the past, one of the most common models used for this purpose is Stable Diffusion. For all we know, this model was never fed with any CSAM materials, but it seems to be good enough for people to get off - which is exactly what matters.
How can it be trained to produce something without human input?
To verify its models are indeed correct, some human has to sit and view the output.
Will that be you?
How can it be trained to produce something without human input?
It wasn’t trained to produce every specific image it produces. That would make it pointless. It “learns” concepts and then applies them.
No one trained AI on material of Donald Trump sucking on feet, but it can still generate it.
It was able to produce that because enough images of both feet and Donald Trump exist.
How would it know what young genitals look like?
As with much of modern AI - it's able to train without much human intervention.
My point is, even if the results are not perfectly accurate and don't exactly resemble a child's body, they work. They are so widely used, in fact, that Europol made a giant issue out of it. People get off to whatever it manages to produce, and that's what matters.
I do not care about how accurate it is, because it's not me who consumes this content. I care about how effective it is at curbing worse desires in pedophiles, because I care about the safety of children.
no, it sort of is. considering style transfer models, you could probably just draw or 3d model unknown details and feed it that.
Again, that’s not how image generators work.
You can’t just make up some wishful thinking and assume that’s how it must work.
It takes thousands upon thousands of unique photos to make an image generator.
Are you going to draw enough child genitalia to train these generators? Are you actually comfortable doing that task?
i’m not, no. but i’m also well-enough versed in stable diffusion and loras that i know that even a model with no training on a particular topic can be made to produce it with enough tweaking, and if the results are bad you can plug in an extra model trained on at minimum 10-50 images to significantly improve them.
Okay, but my point still stands.
Someone has to make the genital models for it to learn from. Some human has to be involved, otherwise it wouldn't just exist.
And if you're not willing to get your hands dirty and do it, why would anyone else?
On one hand I don’t think this kind of thing can be consequence free (from a practical standpoint). On the other hand… how old were the subjects? You can’t look at a person to determine their age and someone that looks like a child but is actually adult wouldn’t be charged as a child pornographer. The whole reason age limits are set is to give reasonable assurance the subject is not being exploited or otherwise harmed by the act.
This is a massive grey area and I just hope sentences are proportional to the crime. I could live with this kind of thing being classified as a misdemeanor provided the creator didn’t use underage subjects to train or influence the output.
I could live with this kind of thing being classified as a misdemeanor provided the creator didn’t use underage subjects to train or influence the output.
So could I, but that doesn’t make it just. It should only be a crime if someone is actually harmed, or intended to be harmed.
Creating a work about a fictitious individual shouldn’t be illegal, regardless of how distasteful the work is.
I think it’s pretty stupid. Borders on Thought Crime kind of stuff.
I’d rather see that kind of enforcement and effort go towards actually finding people who are harming children.
I’ve read it being defined as “victimless crime”; not that I condone it, but thinking about the energy and resources spent for such a large operation… about drawn porn? Cmon.
This is also my take: any person can set up an image generator and churn out any content they want. Focus should be on actual people being trafficked and abused.
There’s a few in the White House.
Even people who want this sort of image banned need to stop describing it like the real thing. These headlines drive me up the wall - like seeing people arrested for “simulated murder.”
Ehhhhh…
It also borders on real CSAM
Paracetamol “borders on” poison, but isn’t.
Slippery slope is a logical fallacy, and there are actual consequences here. We need to do better.
Walk me through how any rendering “borders on” proof that a child was raped.
There are no subjects. It’s image gloop. You cannot exploit or harm JPEG artifacts.
Think of it as a drawing. If you’re sketching from life… yeah, jail. Otherwise what are we even talking about?
It’s not a gray area at all. There’s an EU directive on the matter. If an image appears to depict someone under the age of 18 then it’s child porn. It doesn’t matter if any minor was exploited. That’s simply not what these laws are about.
Bear in mind, there are many countries where consenting adults are prosecuted for having sex the wrong way. It’s not so long ago that this was also the case in Europe, and a lot of people explicitly want that back. On the other hand, beating children has a lot of fans in the same demographic. Some people want to actually protect children, but a whole lot of people simply want to prosecute sexual minorities, and the difference shows.
17-year-olds who exchange nude selfies engage in child porn. I know there have been convictions in the US; not sure about Europe. I know that teachers have been prosecuted when minors sought help when their selfies were being passed around in school, because they sent the images in question to the teacher, and that's possession. In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.
Anyway, what I’m saying is: We need harsher laws and more surveillance to deal with this epidemic of child porn. Only a creep would defend child porn and I am not a creep.
There’s not an epidemic of child porn.
There’s an epidemic of governments wanting greater surveillance powers over the Internet and it is framed as being used to “fight child porn”.
So you’re going to hear about every single case and conviction until your perception is that there is an epidemic of child porn.
“You can’t possibly oppose these privacy destroying laws, after all you’re not on the side of child porn are you?”
Same with misinformation. Where anything they disagree with, in good faith or not, is misinformation.
It’s all part of ‘manufacturing consent’.
There’s plenty of material out in academia about it (as always check your sources), if you want to get into the weeds
Legality is not the same as morality.
It’s not a gray area at all. There’s an EU directive on the matter. If an image appears to depict someone under the age of 18 then it’s child porn.
So a person who is 18 years old, depicted in the nude, is still a child pornographer if they don't look their age? This gives judges and prosecutors too much leeway, and I could guarantee there are right-wing judges who would charge a 25yo because it could be believed they were 17.
In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.
Is it though? I don’t know about the penalties in Germany but in the US a 17yo that takes a nude selfie is likely to be put on a sex offender list for life and have their freedom significantly limited. I’m not against penalties, but they should be proportional to the harm. A day in court followed by a fair amount of community service should be enough of an embarrassment to deter them, not jail.
That’s a directive, it’s not a regulation, and the directive calling anyone under 18 a child does not mean that everything under 18 is treated the same way in actually applicable law, which directives very much aren’t. Germany, for example, splits the whole thing into under 14 and 14-18.
We certainly don’t arrest youth for sending each other nudes:
(4) Subsection (1) no. 3, also in conjunction with subsection (5), and subsection (3) do not apply to acts by persons relating to such youth pornographic content which they have produced exclusively for their personal use with the consent of the persons depicted.
…their own nudes, that is. Not that of classmates or whatnot.
I feel like a lot of people missed which parts of this were a joke.
Followed swiftly by operation jizzberworld
You cannot generate CSAM. That’s the entire point of calling it CSAM!
CSAM stands for “photographic evidence of child rape.” If there’s no child - then that didn’t happen, did it? Fictional crimes don’t tend to be identically illegal, for obvious reasons. Even if talking about murder can rise to the level of a crime - it’s not the same crime, because nobody died.
distribution of images of minors fully generated by artificial intelligence
What minors? You’re describing victims who are imaginary. Zero children were involved. They don’t fucking exist.
We have to treat AI renderings the same way we treat drawings. If you honestly think Bart Simpson porn should be illegal - fine. But say that. Say you don’t think photographs of child rape are any worse than or different from a doodle of a bright yellow dick. Say you want any depiction of the concept treated as badly as the real thing.
Because that’s what people are doing, when they use the language of child abuse to talk about a render.
CSAM stands for “photographic evidence of child rape.”
Why not use the actual acronym definition?
Child Sexual Abuse Material
It’s just as clear that it’s about abuse (more than just rape). You can’t abuse someone who doesn’t exist…
Nevermind that it’s right there in the headline - do you know how a joke works?
Where’s the joke?
What you’re doing is using hyperbole, and it’s completely unnecessary to make your point.
I should not have to explain ‘hyperbole that’s obviously not literally true can be used to exaggerate and underline a point’ when that is also the plain text of your comment. What are we doing, here? Are you okay?
I’m just pointing out that your point completely stands without the hyperbole.
If you’re saying CSAM = rape, then you open yourself up to a ton of irrelevant arguments proving that it’s not rape, when the point you should be defending is that it’s abuse. Taking a picture of a kid from a distance isn’t rape, but it is abuse. See the difference? The hyperbole just weakens your position, and that’s what I’m trying to point out.
You’re worried about my deliberately wrong acronym gag… because some sexual abuse isn’t quite rape… when cops don’t even care whether these children exist.
Pass.
Some AI programs remove clothes from images of clothed people. That could cause harm to a child in the image or to their family.
And the reason it can be called AI-generated CSAM is because the images are depicting something that would be CSAM if it were real. Just like we could say CGI murder victims. Or prosthetic corpses. Prosthetics can’t die, so they can’t produce corpses. But we can call them prosthetic corpses because they’re prosthetics to simulate real corpses.
I’m curious as to why you seem to be defending this so vehemently though. You can call it AI CP if it makes you feel better.
Photoshop can remove the clothes off a child too. Should we ban that and arrest people who use it? What about scissors and tape? You know, the old-fashioned way: take a picture of a child and put the face on someone else's body. Should we arrest people for doing that? This is all a waste of money and resources. Go after actual abusers and save real children instead of imaginary AI-generated 1s and 0s.
Should we ban that and arrest people that use it?
Nobody is saying we should ban AI and arrest the people using it.
Should we arrest people who use photoshop to edit the clothing off of children to produce CSEM? YES! Why is that your defense of this?..
Take a picture of a child and put the face on someone else's body. Should we arrest people for doing that?
YES! Creating CSEM is illegal in a lot of jurisdictions.
Do you want people doing that for your kids?
Hell, CSEM can make a lot of money. Are you going to do that with your own kids? Help them save up for their education!
At that point you have an actual victim and evidence of harm. If the image depicts an actual person, you run into a ton of other laws that punish such things.
But if the child doesn’t actually exist, who exactly is the victim?
Yeah, it would be CSAM if it were real. But it’s not, so it’s not CSAM, therefore no victim.
I replied to another comment with a definition from a definitive source. Computer-generated CSAM is the preferred term. Call it CSEM if you prefer. (E = exploitation)
CSAM/CSEM refers to images/material depicting the sexual Abuse/Exploitation of a child.
AI-generated CSAM means the AI produced images that depict sexual exploitation of children.
You can ask the model to generate murder scenes. You then have AI-generated images of murder scenes. That doesn’t mean anybody was killed. It’s not illegal to own images of murder scenes, but it’s often illegal to own images of CSEM.
Whether the CSEM being AI-generated is enough to protect you in the eye of the law for crimes of owning CSEM is something to take up with a legal professional in your particular jurisdiction.
Whether the CSEM being AI-generated is enough to protect you in the eye of the law for crimes of owning CSEM is something to take up with a legal professional in your particular jurisdiction.
And that’s where I take issue. It shouldn’t be legal to prosecute someone without a victim.
That doesn’t change the law, so you have good advice here. But if I’m put on a jury on a case like this, I would vote to nullify.
Yeah can’t imagine why evidence of child rape would deserve special consideration. We only invented the term CSAM, as opposed to CP, specifically to clarify when it is real, versus the make-believe of ‘wow, this would be bad, if it wasn’t made up.’
Do you think CGI of a dead body is the same as murder?
That’d be bad if it was real! It’d be murder! But it - fucking - isn’t, because it’s not real.
I must underline, apparently: the entire reason for using the term “child sexual abuse material” is to describe material proving a child was sexually abused. That’s kinda fucking important. Right? That’s bad in the way that a human person dying is bad. If you treated James Bond killing someone, in a movie, the same way you treated an actual human person dying, people would think you’re insane. Yet every fucking headline about these renders uses the same language as if typing words into a program somehow fucked an actual child.
The term “computer (or digitally) generated child sexual abuse material” encompasses all forms of material representing children involved in sexual activities and/or in a sexualised manner, with the particularity that the production of the material does not involve actual contact abuse of real children but is artificially created to appear as if real children were depicted. It includes what is sometimes referred to as “virtual child pornography” as well as “pseudo photographs”.
[…]
There is nothing “virtual” or unreal in the sexualisation of children, and these terms risk undermining the harm that children can suffer from these types of practices or the effect material such as this can have on the cognitive distortions of offenders or potential offenders. Therefore, terms such as “computer-generated child sexual abuse material” appear better suited to this phenomenon [than virtual child pornography].
- Terminology Guidelines for the Protection of Children from Sexual Exploitation and Sexual Abuse, section F.4.ii
There’s a reputable source for the terminology usage.
If you want to keep defending CG CSAM, take it up with the professionals
“The professionals are also full of shit” is not much of an argument.
I’m going to hold the words of the people who are actually fighting against child exploration in much higher regard than someone who is defending AI-generated CSAM/CSEM. And honestly, I don’t understand why you’re defending it. It’s weirding me out…lol
As I wrote in another comment,
You can ask the model to generate murder scenes. You then have AI-generated images of murder scenes. That doesn’t mean anybody was killed.
That’s all this is.
That doesn’t mean anybody was killed.
Or that any child was “explored.”
I’m fucking disappointed that anyone professionally engaged in this wants to equate damning evidence of physical abuse with generic representation of the concept - for the exact reasons already described.
There is an insurmountable difference between any depiction of a sex crime involving fictional children - and the actual sexual abuse of real living children. Fuck entirely off about throwing aspersions for why this distinction matters. If you don’t think child rape is fundamentally worse than some imagined depiction of same - fuck you.
equate damning evidence of physical abuse with generic representation
That’s not what it is.
Just like AI-generated murder scenes are not being equated to physical evidence of someone having been murdered.
I think you’re getting caught up in semantics. Can we at least agree that those AI-generated images are bad?