‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • ReluctantMuskrat@lemmy.world · 11 months ago

    It’s a problem for adults too. Circulating an AI-generated nude of a female coworker is likely to be just as harmful as a real photo: just as objectifying, humiliating, and hurtful. Neighbors or other “friends” doing it could be just as bad.

    It’s sexual harassment even if it’s fake.

    • Eezyville · 11 months ago

      I think it should officially be considered sexual harassment. Obtain a picture of someone and generate nudes from it; the harm seems pretty obvious. Maybe the definition should include intent to harm, harass, exploit, or intimidate to make it official.