It seems crazy to me, but I've seen this concept floated on several different posts. There seem to be a number of users here who think there is some way AI-generated CSAM will reduce real-life child victims.
Like the comments on this post here.
https://sh.itjust.works/post/6220815
I find this argument crazy. I don’t even know where to begin to talk about how many ways this will go wrong.
My views (which are apparently not based in fact) are that AI CSAM is not really that different from "actual" CSAM. It still causes harm when viewed, and is still based in the further victimization of the children involved.
Further, the (ridiculous) idea that making it legal will somehow reduce the number of predators by giving them an outlet that doesn't involve real living victims completely ignores the reality of how AI content is created.
Some have compared pedophilia and child sexual assault to drug addiction, which is dubious at best, and pretty offensive imo. Using drugs has no inherent victim, and it is not predatory.
I could go on, but I'm not an expert or a social worker of any kind.
Can anyone link me articles talking about this?
The way I see it is like this: pedophiles, like everybody else, have sexual urges, and like most people, they will try to do something about them… but they can't, for obvious reasons. Since the stigma against pedophilia is so great, a lot of them are too afraid to come out and get help. Therefore, they tend to feel repressed. They're in a situation where they can't do what they want and they're too afraid to get help, and so they try to bottle things up until they snap.
When that happens, they tend to snap in one of two ways. The first is to seek out CP online, and the second is to actually molest a child. Both of these options are terrible, because the former generates demand for more CP content to be created, and thus puts more children in abusive situations, and the latter is just straight-up child rape. I think a lot of them understand the risk of doing either, but they do it anyway, either because they don't care or because they can't help themselves. However, there might be a solution for at least some of them.
If we assume that a certain portion of pedophiles are too afraid to do anything but too repressed not to, then providing them with a legal outlet could push at least some of them away from harming actual kids. That's where I see AI-generated CP coming in. I see it as an outlet, and I don't think it's anything new either. We've had the lolicon/shotacon shit for a while, which is basically just drawn CP. As disgusting as it is, it hasn't resulted in any major spikes in child sexual assault rates as far as I am aware. Therefore, if fake CP, including AI-generated CP, doesn't use any actual CP to generate the images, and it can keep pedophiles away from kids, then I don't see the issue with it. Sure, it is gross, but we have to remind ourselves that we don't ban things based on how gross they are. The reason we ban CP in the first place isn't because it's gross, but because it actually harms kids mentally and physically. If AI-generated CP can keep any pedophiles from harming kids, then I see that as a win.