Taiwanna Anderson’s life changed forever in December 2021, when she found her 10-year-old daughter Nylah unconscious, hanging from a purse strap in a bedroom closet.

Barely an adolescent, Nylah wasn’t suicidal. She had merely come across the “Blackout Challenge” in a feed of videos curated for her by TikTok’s algorithm. The challenge, which circulated on the video-sharing app, encouraged users to choke themselves with household items until they blacked out; once they regained consciousness, they were supposed to upload the video results for others to replicate. After several days in a hospital’s intensive care unit, Nylah succumbed to her strangulation injuries. Anderson sued TikTok for product liability and negligence, alleging that its conduct led to Nylah’s death.

For years, when claimants tried to sue internet platforms for harms experienced online, the platforms benefited from what amounted to a get-out-of-jail-free card: Section 230 of the Communications Decency Act, a 1996 statute that offers apps and websites broad immunity from liability for content posted to their sites by third-party users. In 2022, a federal district judge accepted TikTok’s Section 230 defense and dismissed Anderson’s lawsuit, reasoning that TikTok didn’t create the Blackout Challenge video Nylah saw; a third-party user of TikTok did.

But on Tuesday, the federal Third Circuit Court of Appeals released an opinion reviving the mother’s lawsuit, allowing her case against TikTok to proceed to trial. TikTok may not have filmed the video that encouraged Nylah to hang herself, but the platform “makes choices about the content recommended and promoted to specific users,” Judge Patty Shwartz wrote in the appellate court’s opinion, “and by doing so, is engaged in its own first-party speech.”

  • @ted · 3 points · 15 days ago

    A trending-posts feed is not a recommendation algorithm, in my opinion. I think the slope stops at curation (interest-based algorithms targeted at specific users).

    I.e., the Reddit homepage is now curated; Lemmy sorting methods are not.
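
    To make that line concrete in code: a trending sort ranks every post with the same function for every viewer, while a curated feed also consults a per-user interest profile. A minimal Python sketch, with every name and constant invented for illustration:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Post:
        score: int        # net upvotes
        age_hours: float  # time since posting
        topic: str

    @dataclass
    class UserProfile:
        # Per-user interest weights, e.g. {"tennis": 2.0}
        interest_weights: dict[str, float] = field(default_factory=dict)

    def trending_rank(post: Post) -> float:
        # User-independent: every viewer gets the same ordering.
        return post.score / (post.age_hours + 2) ** 1.5

    def curated_rank(post: Post, user: UserProfile) -> float:
        # User-dependent: the viewer's interests reshape the ordering.
        # This is the side of the line called "curation" above.
        return trending_rank(post) * user.interest_weights.get(post.topic, 1.0)
    ```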

    • @[email protected] · 2 points · 15 days ago

      Lemmy sorting is still interest-based if you’re not scrolling through /all; it’s just that those are declared interests (you subscribe to the tennis community) as opposed to inferred interests (the algorithm figured out you like tennis from your watching habits). It’s still curated; it’s just self-curated instead of algorithmically curated.

      So I guess you could say it stops at how the interests are compiled and whether the interest was given explicitly by the user, but then you get into how a user understands certain actions, like likes. Do people like something just to give feedback to the poster? Then it shouldn’t be used at all. Do they like something because they want to boost it and have their wider community see it? Then the algorithm can take that into account when showing it to friends and followers. Do they like something because they want to see more of it? Then the algorithm can use that information when recommending things that user will see. My guess is people use it as some combination of all three, and as long as the social media platform tells its users up front that the heart button means all three, it could get away with saying its algorithm is explicit while not changing much.
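
      Sketched in Python, the ambiguity is that one button press fans out into three different signals (everything below is hypothetical, just naming the three readings of a like):

      ```python
      from dataclasses import dataclass, field

      @dataclass
      class User:
          feedback_received: int = 0
          followers: list["User"] = field(default_factory=list)
          boost_queue: list[str] = field(default_factory=list)
          interest_weights: dict[str, float] = field(default_factory=dict)

      def register_like(liker: User, poster: User, post_id: str, topic: str) -> None:
          # 1) Feedback to the poster.
          poster.feedback_received += 1
          # 2) A boost the liker's wider community gets to see.
          for follower in liker.followers:
              follower.boost_queue.append(post_id)
          # 3) A "show me more of this" signal for the liker's own feed.
          liker.interest_weights[topic] = liker.interest_weights.get(topic, 1.0) + 0.1
      ```

      A platform that declares up front that a like means all three can call the input “explicit” while still feeding reading 3 into its recommender.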

    • @[email protected] · 1 point · 15 days ago

      Then if they go this route, they’d better make sure they clearly define what they mean by a “recommendation algorithm” or an “interest-based algorithm,” because the opinions of individuals won’t hold up in court.

      If it’s not defined, an attorney could easily argue that Lemmy’s “Scaled” algorithm is a “recommendation algorithm,” and you would hope that the judge understood enough about programming to know where to draw the line.
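
      For reference, Lemmy describes Scaled as being like Hot but with a boost for less active communities. A Python sketch of that general shape (the constants and exact formulas here are illustrative, not copied from Lemmy’s source):

      ```python
      import math

      def hot_rank(score: int, age_hours: float) -> float:
          # Rank rises with score and decays with age. Note that nothing
          # user-specific enters the formula.
          return 10000 * math.log10(max(2, score + 2)) / (age_hours + 2) ** 1.8

      def scaled_rank(score: int, age_hours: float, community_monthly_users: int) -> float:
          # Dividing by community activity boosts posts from smaller communities.
          return hot_rank(score, age_hours) / math.log(2 + community_monthly_users)
      ```

      Both functions rank identically for every viewer, which is the programming distinction a judge would have to grasp; whether that alone keeps them outside a legal definition of “recommendation algorithm” is exactly the open question.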