A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, as well as images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • @brbposting · 3 months ago

    It’s unacceptable.

    We have legal and justice systems to deal with this.

    For reference, here’s how we’re doing with child porn. Platforms with problems include (copying from my comment two months ago):

    Ill adults and poor kids generate and sell CSAM. Common to advertise on IG, sell on TG. Huge problem as that Stanford report shows.

    Telegram got right on it (not). Fuckers.

    • @mindbleach · 3 months ago

      Obligatory nitpick: you can’t generate CSAM. That’s not what “CSAM” means. The entire point of calling it that is to distinguish actual pictures of actual children suffering real-life sexual abuse… versus things that didn’t happen.

      It’s like someone claiming “I’ll generate proof of murder.” You sure won’t, bucko. Whatever gross image you produce, it will never be that. Especially if the guy in the image is still walking around.

      • @brbposting · 3 months ago

        “You sure won’t, bucko.”

        Ahaha true!

        Clarifying: my copied comment used “generate” to mean “produce”.

        In any case, your point is well taken. Something to think about.

        • @mindbleach · 3 months ago

          Ahh, yeah, I guess that’d be the definition used prior to… all of this.

          Though doing that on-demand for ten bucks is a whole different kind of fucked up.