In this video I discuss how generative AI technology has grown far past the government's ability to effectively control it and how the current legislative mea...
Prove it’s fake when some of it, depicting your daughter, is making its way around school.
You’ve missed the point. Fake or not, it does damage to people. And eventually it won’t be possible to determine whether it’s real or not.
When that becomes widespread, photos will be generatable for literally everyone, not just minors but every person with photos online. It will be a societal shift; images will be assumed to be AI generated, making any guilt or shame about a nude photo existing obsolete.
What a disguising assumption. And the best argument against AI I’ve ever heard.
I mean, anyone with enough artistic talent can draw whatever they would like right now. With AI image generation, it essentially just gives everyone the ability to draw whatever they want. You can try to fight the tech all you want, but it’s a losing battle.
You may not like it, but do you really see another likely scenario?
Disguising or disgusting?
AI generated porn depicting real people seems like a different and much bigger issue
AI generated CSAM in general, while disgusting, at least doesn’t directly harm people; fabricated nudes most definitely do, regardless of the age of the victim
You just implied children aren’t real people.
AI generated nudes of no one in particular aren’t hurting anyone, not directly at least, but AI generated nudes of a specific person, using that person’s likeness and everything, that’s much worse
AI can generate faces of people that don’t actually exist, that’s what I mean
The post made it seem like it was about AI generated CSAM in general, which, while disgusting, doesn’t directly harm anyone; but then the comments spoke about AI generated CSAM depicting a real individual, which is much worse, but also not a problem that’s specific to children
deleted by creator
Currently pedos tend to group up and share real CSAM, and these “communities” probably serve to normalize the activity for their members. Perhaps being able to generate it will keep pedos from clumping together, reducing the degree of normalization so they’re more likely to seek help, and as a bonus, real children aren’t preyed upon to create said CSAM?
And as for saying that removing AI tools that can generate CSAM will lead them to “attempt to fuck children in the streets”, as you put it: would you also say that we should stop criminalizing the distribution of existing CSAM, because the existing CSAM that is shared in paedophile circles is all that is keeping them from going out and raping children?
AI CSAM is incredibly harmful. All CSAM is harmful. It’s been shown to increase the chance of pedophilic abuse.
Stop defending CSAM, HOLY SHIT.
Can you link me a source for that, please?
Jeez, calm down
I am not defending CSAM, just saying that CSAM depicting an actual existing child is orders of magnitude worse, as is any other kind of fabricated sexual content of real people.
Take loli porn, for example: it’s probably bad for society, but if someone makes loli porn based on the appearance of an actual individual, that’s much more fucked up, and in addition to the “normal” detrimental effects, it would also harm that victim in a much more direct way.