‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity
It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • andrew_bidlaw · 1 year ago

    It was inevitable. And it says more about those who use them.

    I wonder how we’d adapt to these tools being that available. Especially in blackmail, revenge porn, voyeuristic harassment, stalking, etc. Maybe nude photos and videos will no longer be seen as a trusted source of information; they won’t be anything unique worth hunting for, or worth worrying about.

    Our perception of human bodies was long distorted by movies, porn, Photoshop and the subsequent ‘filter apps’, but we still kinda trusted there was something real before the effects were applied. But what comes next if everything is imaginary? Would we stop caring about it in the future? Or would we grow up with a stunted imagination, since the stimuli that develop it in early years would be long gone?

    There are some useless dogmas around our bodies that could be lifted in the process, or a more relaxed trend towards clothing choices could get its start. Who knows?

    I see the bad sides of it right now, and how it can be abused, but if these AI models are here to stay, what are the long-term consequences for us?

    • LufyCZ@lemmy.world · 1 year ago

      I think that eventually it might be a good thing, especially in the context of revenge porn, blackmail, etc. Real videos won’t carry any weight since they might as well be fake, and as society gets accustomed to that, we’ll see those types of things disappear completely.

      • bnaur@lemmy.world · 1 year ago

        Yep, once anyone can download an app on their phone and do something like this in real time without any effort, it’s going to lose its (shock) value fast. It would be like sketching crude boobs and a vagina on someone’s photo with MS Paint and trying to use that for blackmail or shaming. It would just seem sad and childish.