I work for a hosting provider, and recently we received a report about a user hosting AI-generated CSAM. I verified it and forwarded it to the legal team. They told him to GTFO.
He left a negative review because we “wouldn’t let him host AI-generated content”. Nuh-uh sir, that is not why. Some people are just so out of touch with reality.
I thought the same, but the article puts it into perspective. First, training anything requires having content on hand, and that means children were exploited to get it. Second, enabling and allowing it feeds deeper desires, possibly pushing people even further. Whether that’s good or bad is definitely above our pay grades to determine. The letter calls out that by allowing it, we as a society are normalizing it, saying it’s okay, and we definitely do not want it normalized.
Unfortunately, this probably does mean the end of the wild west of AI generation. I don’t think this will stop tools from being created; it sounds like tools like SD and training are going to stick around (even though Ars somehow demonized the term ‘open source’), probably because they know the tools would just be forked and continued later. But I think we’re going to see a lot more regulation on sharing models. Right now there are a few obvious sites sharing things that even I’ve been shocked are allowed, and I think we’re going to see a lot more rules on what can and can’t be uploaded.
Even as the tech gets easier, fine-tuning and training models is a much more involved process than most people will want to go through, so stopping the sharing of those models will cut out the vast majority. (Similar to the split between those who share vs. those who only consume illegal material already on the internet.)
Looks like the first step they’re calling for is basically saying that even if CSAM was created by AI, it should still be classified legally as CSAM, so trading and sharing it means a one-way ticket to jail. I’m okay with that personally.
Actually, the image generation models can most likely generate that kind of material without ever having seen it before. There probably are people out there training them with real material, but it’s not an absolute prerequisite. They can generalize enough to create that combination from legal pornography and normal images of children.
Yup, same reason you can ask for a fox using a crocodile as a mech and get a good result. The model has the concept of everything requested and mixes them (with varying success).
What a terrible thing to try to unravel. And something we as a society should be very focused on solving.
Obviously there is little someone can do to prevent it since it can run locally. Making the exchange of CSAM illegal is easier.
I just hope the AI stuff reduces the exploitation of real children.
Then maybe we can focus on therapy for these sick minded fucks that create and consume CSAM.