jeffw@lemmy.world to News@lemmy.world · 7 months ago
“CSAM generated by AI is still CSAM,” DOJ says after rare arrest (arstechnica.com)
Grandwolf319 · 7 months ago
So it’s all good as long as they have elf ears, or does that count as realistic too?
ricecake · 7 months ago
Two things:

1. Please don’t generate child-like pornography. Legal or not, it’s disturbing and gross to even think about.

2. Yes, per the law it must be “virtually indistinguishable”:

“the term ‘indistinguishable’ used with respect to a depiction, means virtually indistinguishable, in that the depiction is such that an ordinary person viewing the depiction would conclude that the depiction is of an actual minor engaged in sexually explicit conduct. This definition does not apply to depictions that are drawings, cartoons, sculptures, or paintings depicting minors or adults.”

If it looks like a “real” elf and not a child wearing an elf costume, it would be fine. So long as an ordinary person would know that it’s not a real child being abused, or a real child being depicted (e.g., placing a real child’s face on a compromising photo), it’s protected, albeit extremely unpleasant, speech.