TikTok is immune from liability for the death of a 10-year-old girl who strangled herself after watching a “Blackout Challenge” video on the social media app, a federal judge ruled Tuesday, citing Communications Decency Act Section 230. Tawainna Anderson sued the platform, claiming it was responsible for the death of her daughter, Nylah. The circumstances are “tragic,” but because Anderson sought to hold TikTok liable as a publisher of third-party content, the platform is immune under Section 230, Judge Paul Diamond wrote in his memorandum, granting TikTok’s motion to dismiss on “immunity grounds.” TikTok didn’t create the challenge but only made it “readily available” on its app, Diamond wrote: TikTok’s algorithm was a “way to bring the challenge to the attention of those likely to be most interested in it.” Section 230 protects the platform for publishing others’ works, he said: “The wisdom of conferring such immunity is something properly taken up with Congress, not the courts.” FCC Commissioner Brendan Carr drew attention to the case, tweeting that TikTok used Nylah’s personal information to serve her the blackout challenge video encouraging users to strangle themselves: “She did that with a purse strap & dies. Court accepts all this as true & rules that § 230 shields TikTok from liability.”
A lawsuit accusing Twitter of sex trafficking should be dismissed because the plaintiffs failed to show the company knowingly benefited from criminal sex trafficking under a 2018 law, the platform argued Friday before the 9th U.S. Circuit Court of Appeals (docket 3:21-cv-00485). The Fight Online Sex Trafficking Act, a 2018 carve-out from Communications Decency Act Section 230, applies “only where a civil defendant meets the criteria for criminal sex trafficking, which Plaintiffs fail to allege against Twitter in multiple respects,” the company said in its third brief on cross-appeal. The two minor plaintiffs sought to have Twitter remove sexually explicit videos of themselves from the platform. The U.S. District Court in San Francisco said the plaintiffs had an avenue of relief under the 2018 law, allowing them to move forward with their trafficking claims but rejecting claims based on distribution of child pornography. Congress made a “deliberate choice” to pass FOSTA with a limited carve-out, holding platforms liable only when they knowingly engage in “criminal” sex trafficking, and this lawsuit doesn’t meet that standard, the company said.