• Ignotum@lemmy.world

      AI-generated nudes of no one in particular aren’t hurting anyone, at least not directly, but AI-generated nudes of a specific person, using that person’s likeness, are much worse.

      AI can generate faces of people who don’t actually exist; that’s what I mean.

      The post made it seem like it was about AI-generated CSAM in general, which, while disgusting, doesn’t directly harm anyone. But then the comments spoke about AI-generated CSAM depicting a real individual, and that’s much worse, though also not a problem that’s specific to children.

        • Ignotum@lemmy.world

          Currently, pedos tend to group up and share real CSAM, and these “communities” probably serve to normalize the activity for their members. Perhaps being able to generate it will keep pedos from clumping together, reducing that normalization so they’re more likely to seek help, and as a bonus, real children aren’t preyed upon to create said CSAM?

          And as for your claim that removing AI tools that can generate CSAM will lead them to “attempt to fuck children in the streets”: would you also say that we should stop criminalizing the distribution of existing CSAM, because the CSAM already shared in paedophile circles is all that’s keeping them from going out and raping children?

      • Neato@kbin.social

        AI CSAM is incredibly harmful. All CSAM is harmful. It’s been shown to increase the chance of pedophilic abuse.

        Stop defending CSAM, HOLY SHIT.

        • Helix 🧬@feddit.de

          “It’s been shown to increase the chance of pedophilic abuse.”

          Can you link me a source for that, please?

        • Ignotum@lemmy.world

          Jeez, calm down.

          I am not defending CSAM, just saying that CSAM depicting an actual, existing child is magnitudes worse, as is any other kind of fabricated sexual content depicting real people.

          Take loli porn, for example: it’s probably bad for society, but if someone makes loli porn based on the appearance of an actual individual, that’s much more fucked up. In addition to the “normal” detrimental effects, it would also harm that victim in a much more direct way.