Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image-generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has previously taken down several of these ads, many ads that explicitly invited users to create nudes, and some of the accounts buying them, remained live until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

  • Schadrach@lemmy.sdf.org · 8 months ago

    “so I don’t understand what your point here is”

    It’s that all the articles over the last year screaming about the dangers of AI, because it can be used for something an interested high school student could have done with an image editor 30 years ago, only more easily and arguably at somewhat better quality (depending on the person using Photoshop), are being ridiculous: they’re blaming the technology instead of the weirdo using it to doctor an image of that girl at their school and pass it around. And yes, anyone who makes and distributes one of these images of someone should be nailed for revenge porn, harassment, and whatever else might apply. I say “and distributes” only because if they never distribute it, no one will ever know it exists, so there would be no opportunity to bust them.

    The best use (i.e., the only good use) for one of these is to feed it an image of something that is definitely not the right kind of input and see what horrors it invents trying to fill in the blanks. Hand it a photo of your buddy with a beer belly and a mountain-man beard, or a dog, or a garden gnome or something.

    • BreakDecks@lemmy.ml · 8 months ago

      Generative AI is being used quite prominently for the purposes of making nonconsensual pornography. Just look at the state of CivitAI, the largest marketplace of Stable Diffusion models online. It pretends to be a community for machine learning professionals, but behind the scenes it’s laying the groundwork for all of the problems we’re seeing right now. There’s not an actress or female celebrity who doesn’t have a textual inversion (TI) embedding or LoRA trained on her likeness, and the galleries don’t hold back on showing you what these models can do.

      At least Photoshop never gained the specific reputation of being a tool for making fake porn, but the GenAI community is leaving no doubt that this is a major use case for image models.

      Even HuggingFace turns a blind eye to pornifying models and lolicon datasets, and they’re basically the GitHub of AI models…

      • ArmokGoB@lemmy.dbzer0.com · 8 months ago

        Knowledge of fission is often applied to make nuclear bombs, but also to generate nuclear power. We shouldn’t blame AI as a whole for this just because some creeps use it for shitty applications.

        • BreakDecks@lemmy.ml · edited · 8 months ago

          That’s kinda why I brought up specific key players and how I consider them complicit. If you don’t want AI to be blamed as a whole, you should want those key players to behave ethically, or they’ll poison public perception of AI as a whole.