Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image-generation apps. This once again shows that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their own policies about who can buy ads on their platforms.

Parent company Meta’s Ad Library, which archives ads on its platforms along with who paid for them and where and when they ran, shows that the company has previously taken down several of these ads. Even so, many ads that explicitly invited users to create nudes, and some of the accounts buying them, remained live until I reached out to Meta for comment. Some of these ads promoted the best-known nonconsensual “undress” or “nudify” services on the internet.

  • CaptainEffort · 7 months ago

    Idk how many people are viewing this as normal, I think most of us recognize all of this as being incredibly weird and creepy.

    • UnderpantsWeevil@lemmy.world · 7 months ago

      > Idk how many people are viewing this as normal

      Maybe not “Lemmy” us. But the folks who went hog wild during The Fappening, combined with younger people who are coming into contact with pornography for the first time, make a ripe base of users who will consider this the new normal.

      • CaptainEffort · 7 months ago

        Yeah damn, that’s true.

        An obvious answer would be to talk to younger people about it, to explain how gross and violating it is. Even if it doesn’t become illegal, there are plenty of legal things that people avoid and recognize as bad because they were taught correctly.

        Unfortunately, due to how puritan our society is, I can’t imagine many parents would be willing to talk to their kids about stuff like this.