‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • azertyfun
    1 year ago

    Why are you pretending that “nudify apps” produce ephemeral pictures equivalent to a mental image? They most definitely do not.

    Underage teenagers have ALREADY shared fake porn of their classmates. It being illegal doesn’t stop them, and as fun as locking up a thirteen-year-old sounds (assuming they get caught, prosecuted, and convicted), that still leaves another kid traumatized.

    • KairuByte@lemmy.dbzer0.com
      1 year ago

      So if illegality doesn’t stop things from happening… how exactly are you stopping these apps from being made?

      • azertyfun
        1 year ago

        Go after the people advertising those apps. Developers and ad agencies who say, or intentionally imply, “create naked pictures of people you know” should all be prosecuted.

        Unlike Photoshop or generic Stable Diffusion software, these apps have literally no legitimate reason to exist, since the ONLY thing they facilitate is creating non-consensual pornography. Seems like something that would be very easy to criminalize.

        • KairuByte@lemmy.dbzer0.com
          1 year ago

          So wait, we can’t criminalize the use, but criminalizing the advertisement fixes the situation?

          You realize the exact same problem exists? There are plenty of tools with illegal uses, easily accessible online right now. Many on GitHub.