Julia, 21, has received fake nude photos of herself generated by artificial intelligence. The phenomenon is exploding.

“I’d already heard about deepfakes and deepnudes (…) but I wasn’t really aware of them until it happened to me. It was something anecdotal that happened in other people’s lives; it wouldn’t happen in mine,” thought Julia, a 21-year-old Belgian marketing student and semi-professional model.

At the end of September 2023, she received an email from an anonymous sender. Subject: “Realistic?” “We wonder which photo would best resemble you,” she reads.

Attached were five photos of her.

In the original photos, posted on her social networks, Julia poses clothed. Before her eyes are the same images. Only this time, Julia is completely naked.

Julia has never posed naked. She never took these photos. The Belgian model realises that she has been the victim of a deepfake.

  • sir_pronoun@lemmy.world · 8 months ago

    I seriously don’t get why society cares if there are photos of anyone’s private parts.

    • VeganCheesecake@lemmy.blahaj.zone · 8 months ago

      I think we as a society are too uptight about nudity, but that doesn’t mean that creating pictures of people without their consent, which make them feel uncomfortable, is in any way OK.

      • Llewellyn@lemm.ee · 8 months ago

        What about photos of politicians in compromising situations? Should we have those without their consent?

      • damnthefilibuster@lemmy.world · 8 months ago

        They are humiliated only because society has fed them the idea that what they’ve done (in this case, not something they did, but something done to them) is wrong. Internalizing shame meted out by society is the real psychological problem we need to fix.

        • tsonfeir@lemm.ee · 8 months ago

          Society does indeed play a big role, but if someone went around telling lies about you that everyone believed regardless of how much you denied it, that would take a toll on you.

          • sir_pronoun@lemmy.world · 8 months ago

          That’s what I meant. Why should it be shameful? If it weren’t, those photos would lose so much of their harm.

    • natecheese@kbin.melroy.org · 8 months ago

      I think the issue is that sexual imagery of the person is being created and shared without that person’s consent.

      It’s akin to taking nude photos of someone without their consent, or sharing nude photos with someone other than their intended audience.

      Even if there were no stigma attached to nudes, that doesn’t mean someone would want their nudes to exist or be shared.

    • prettybunnys · 8 months ago

      I’ll continue this conversation in good faith only after you’ve shown us yours to prove your position.

    • xePBMg9@lemmynsfw.com · 8 months ago

      Modern surveillance capitalism has made sharing of private data normalised. These days we are very used to sharing pretty much everything about ourselves, in addition to having no control over how that information is used. That is a bad thing.

      I suspect that these tools will, similarly, make nudity a bit more normalised in many societies across the world. That is probably a good thing overall.