August 29th, 2024

Appeals court revives TikTok 'blackout challenge' death suit

A U.S. appeals court revived a lawsuit against TikTok over the death of 10-year-old Nylah Anderson, ruling that TikTok's algorithmic recommendations do not qualify for Section 230 protection, a decision with broad implications for how platform liability is assessed.

A U.S. appeals court has revived a lawsuit against TikTok over the death of 10-year-old Nylah Anderson, who died after attempting the "blackout challenge," a dangerous trend promoted on the platform. The Third Circuit Court of Appeals ruled that TikTok's algorithm, which curated content for Anderson's "For You Page," does not qualify for protection under Section 230 of the Communications Decency Act, reversing an earlier district court ruling that had granted TikTok immunity from liability. The court emphasized that TikTok's algorithmic choices reflect editorial judgments, which could expose the platform to legal responsibility for harmful content it promotes. Judge Patty Shwartz, writing for the panel, held that platforms engaging in content curation assume liability for the consequences of their actions, while Judge Paul Matey noted in a concurrence that Section 230 should not create a "lawless no-man's land" for social media companies. The case now returns to the District Court in Pennsylvania for further proceedings. The ruling may have broader implications for Section 230 protections across social media platforms, as it sets a precedent that could narrow the extent of immunity these companies claim.

- The Third Circuit Court ruled TikTok is not protected by Section 230 in a lawsuit over a child's death.

- The decision highlights the responsibility of social media platforms for algorithmically curated content.

- The case has been remanded to the District Court in Pennsylvania for further proceedings.

- The ruling may influence future interpretations of Section 230 protections for other social media platforms.

- The court's opinion suggests a shift in how liability is assessed for content moderation practices.

3 comments
By @noman-land - about 2 months
I think I totally agree with the appeals court here.

"In short, you can't have it both ways: Either you serve everything, let users sort it out and keep that liability shield; or you make algorithmic picks that surface content, give users what you think they want and take on the liability that comes with being the arbiter of that content."

Then again, is this not what a search engine does?

Then again, the search engine shouldn't have personalized results.

But some people want personalized results.

But children are not old enough to understand the consequences.

By @BlueTemplar - about 2 months
Yeah, this issue was pointed out many years ago; nice for the courts to finally catch up!

I'd also note that it's a company controlled by a US competitor (China) that gets hammered first (?) by a US court. I wonder if US platforms have benefited from self-censorship by the courts?

See also how, even though US infocoms were in theory made illegal in the EU nearly a decade ago (with the US state violating human rights and these infocoms being forced to help it), that ban still hasn't been enforced, partly because relations between the USA and the EU have mostly remained pretty good. Instead we have seen a bunch of other laws pop up in the EU, with the screws being tightened only slowly and meekly.