THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes: sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress' upper chamber on Tuesday. It was led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), as well as Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they "knew or recklessly disregarded" the fact that the victim did not consent to those images.

  • @ArbitraryValue · 22 points · 1 month ago

    distribute, or receive the deepfake pornography

    Does this make deepfake pornography more restricted than real sexual images either created or publicly released without consent?

    • @[email protected]
      link
      fedilink
      251 month ago

      I think so. In a way, it makes sense: a lot of people are of the (shitty) opinion that if you take lewd pictures of yourself, it's your fault if they're disseminated. With lewd deepfakes, there's less opportunity for victim blaming, and therefore wider support.

      • @ArbitraryValue · 2 points · edited · 1 month ago

        Maybe, but my understanding is that it's legal to possess photos of someone actually being the victim of a sex crime, not just photos of consensual acts.