Taiwanna Anderson’s life changed forever in December 2021, when she found her 10-year-old daughter Nylah unconscious, hanging from a purse strap in a bedroom closet.

Barely an adolescent, Nylah wasn’t suicidal. She had merely come across the “Blackout Challenge” in a feed of videos curated for her by TikTok’s algorithm. The challenge circulating on the video-sharing app encouraged users to choke themselves with household items until they blacked out. When they regained consciousness, they were supposed to upload their video results for others to replicate. After several days in a hospital’s intensive care unit, Nylah succumbed to her strangulation injuries. Anderson sued TikTok for product liability and negligence, alleging that the platform’s conduct led to Nylah’s death.

For years, when claimants tried to sue internet platforms for harms experienced online, the platforms benefited from what amounted to a get-out-of-jail-free card: Section 230 of the Communications Decency Act, a 1996 statute that offers apps and websites broad immunity from liability for content posted to their sites by third-party users. In 2022, a federal district judge accepted TikTok’s Section 230 defense and dismissed Anderson’s lawsuit, reasoning that TikTok didn’t create the blackout challenge video Nylah saw—a third-party user of TikTok did.

But on Tuesday, the US Court of Appeals for the Third Circuit released an opinion reviving the mother’s lawsuit, allowing her case against TikTok to proceed to trial. TikTok may not have filmed the video that encouraged Nylah to hang herself, but the platform “makes choices about the content recommended and promoted to specific users,” Judge Patty Shwartz wrote in the appellate court’s opinion, “and by doing so, is engaged in its own first-party speech.”

  • DaCrazyJamez · 3 months ago

    I hate to take TikTok’s side in anything, but in this case I don’t think they should be liable. A kid made a dumb choice based on a stupid video… but even if it showed up in a feed, TikTok didn’t encourage the behavior.

    This one is a sad case of darwinism.

    • Diplomjodler@lemmy.world · 3 months ago

      Wrong. Children are being shown these videos in the name of “engagement”, i.e. in order to maximise profits. That is the main problem with oligopolistic social media platforms. It’s the algorithms that are destroying society.