A lawsuit filed by more victims of the sex trafficking operation claims that Pornhub’s moderation staff ignored reports of their abuse videos.


Sixty-one additional women are suing Pornhub’s parent company, claiming the company failed to take down videos of the abuse they suffered as part of the sex trafficking operation Girls Do Porn. They’re suing the company and its sites for sex trafficking, racketeering, conspiracy to commit racketeering, and human trafficking.

The complaint, filed on Tuesday, includes what it claims are internal emails between Pornhub moderation staff, obtained by the plaintiffs, who are represented by Holm Law Group. The emails allegedly show that Pornhub had only one moderator to review 700,000 potentially abusive videos, and that the company intentionally ignored repeated reports from the victims who appeared in those videos.

The damages and restitution they seek amount to more than $311,100,000. The plaintiffs demand a jury trial and seek damages of $5 million per plaintiff, as well as restitution for all the money Aylo, the new name for Pornhub’s parent company, earned “marketing, selling and exploiting Plaintiffs’ videos in an amount that exceeds one hundred thousand dollars for each plaintiff.”

The plaintiffs are 61 more unnamed “Jane Doe” victims of Girls Do Porn, adding to the 60 who sued Pornhub in 2020 on similar claims.

Girls Do Porn was a federally convicted sex trafficking ring that coerced young women into filming pornographic videos under the pretense of “modeling” gigs. In some cases, the women were violently abused. The operators told them the videos would never appear online, so that their home communities wouldn’t find out, but then uploaded the footage to sites like Pornhub, where the videos went viral and, in many instances, destroyed the women’s lives. Girls Do Porn was an official Pornhub content partner, and its videos frequently appeared on the front page, where they gathered millions of views.

read more: https://www.404media.co/girls-do-porn-victims-sue-pornhub-for-300-million/

archive: https://archive.ph/zQWt3#selection-593.0-609.599

  • @Ookami38
    10 points · 9 months ago

    You know, I’ll give you this much: there’s not much evidence on either side that it is a safe outlet. Until there is, the only metric we can really use is what level of harm a thing does by existing. And in the case of AI-generated porn of ANY kind, it harms no one. I’ll accept that it may cause long-term societal harm once I see proof.

      • @Ookami38
        12 points · 9 months ago

        Are you actually kidding? You’re literally proposing thought policing lol

          • @Ookami38
            12 points · 9 months ago

            You’re equating things that go on only in your thoughts with things that actually happen. If you say these things are the same, which you did, then you’re saying they should be punished the same. Otherwise, shocker, they’re not the same.

            Thoughts = actions. Actions = bad. Thoughts = bad.

            Simple transitive property taught in elementary school.

            Thoughts = actions. Actions = punished. Thoughts = punished.

            Again, simple transitive property. You are either saying thoughts are the same as actions and should be policed as actions, or you’re saying thoughts aren’t the same as actions.

      • @[email protected]
        10 points · 9 months ago

        So you must also think murder mystery books are horrid crimes? As are horror movies? There’s a lot of murder depicted there. Accepting that is akin to accepting murder itself. What’s the difference? You’re getting off on and being entertained by murder? Clearly your desire is no different.