• m0darn@lemmy.ca
    1 year ago

    Did nobody in this comment section actually watch the video?

    The only case mentioned by this video is a case where highschool students distributed (counterfeit) sexually explicit images of their classmates which had been generated by an AI model.

    I don’t know whether it meets the definition of CSAM, since the events depicted in the images are fictional, but the subjects are real.

    These children exist, and some have doubtless been traumatized by this. This crime has victims.