July 26th, 2024

The Kids Online Safety Act and the Tyranny of Laziness

The Kids Online Safety Act aims to regulate content recommendation systems but raises concerns about censorship, free speech, and potential overreach in defining "harmful" content, complicating compliance for tech companies.

The Kids Online Safety Act (KOSA) aims to regulate content recommendation systems on digital platforms, but its implementation raises concerns about censorship and free speech. The bill's authors maintain that it targets design rather than content, yet regulating recommendation systems inherently affects the speech those systems surface. To comply with KOSA, tech companies may resort to broad censorship, targeting politically sensitive content and inadvertently suppressing important discussions, such as those about mental health. The bill relies on the Federal Trade Commission (FTC) and a Kids Online Safety Council to define "harmful" content, which could open the door to government overreach in content moderation.

KOSA's requirements may compel platforms to implement additional content moderation layers, resulting in practices like "shadowbanning," where content is deranked or hidden based on certain keywords. This could create a censorship system that stifles diverse viewpoints and important conversations, particularly for vulnerable users. The bill's vague definitions and reliance on government guidance could lead to over-censorship, as companies may err on the side of caution to avoid penalties.
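
To make the mechanism concrete, here is a minimal Python sketch of the kind of keyword-based deranking described above. It is purely illustrative and not drawn from KOSA or from any platform's actual system: the FLAGGED_TERMS list, the derank_score function, and the penalty factor are all hypothetical. What it demonstrates is the article's point that a filter which matches on words rather than meaning demotes a crisis-support post and a genuinely harmful post alike.

    # Hypothetical keyword-based deranking ("shadowbanning") filter.
    # A post whose text contains any flagged term has its feed-ranking
    # score sharply reduced, regardless of what the post actually says.

    FLAGGED_TERMS = {"suicide", "self-harm", "eating disorder"}  # illustrative only

    def derank_score(post_text: str, base_score: float) -> float:
        """Return a feed-ranking score, heavily penalizing flagged keywords."""
        text = post_text.lower()
        if any(term in text for term in FLAGGED_TERMS):
            return base_score * 0.1  # quietly demote the post
        return base_score

    # Both posts contain a flagged term, so both get demoted, even though
    # the first is exactly the kind of mental-health resource the article
    # worries will be suppressed.
    posts = [
        "If you're struggling, call the 988 Suicide and Crisis Lifeline.",
        "(placeholder for a post that genuinely encourages self-harm)",
    ]
    for text in posts:
        print(round(derank_score(text, 1.0), 2), "-", text)

A real pipeline would layer classifiers on top of keyword lists, but the structural problem the article points to, matching surface features while the meaning lives in context, is the same one that makes compliance hard.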

Moreover, accurately identifying harmful content is difficult in itself, since algorithms struggle to discern context and nuance, which further complicates compliance. KOSA's approach may ultimately exacerbate the very problems it seeks to address, suggesting that Congress should reconsider the bill's content moderation provisions and consult more thoroughly with experts before moving forward.

Related

Can a law make social media less 'addictive'?

New York passed laws to protect children on social media: SAFE for Kids Act requires parental consent for addictive feeds and limits notifications; Child Data Protection Act restricts data collection. Debate ensues over enforceability and unintended consequences.

Protocols, Not Platforms: A Technological Approach to Free Speech

Internet platforms struggle with managing free speech, privacy, and disinformation. Criticism includes hate speech, censorship, foreign interference, and propaganda. Advocates propose using open protocols to empower users, foster competition, innovation, and privacy, and create new business models.

Canada Allocates $146.6M for New Censorship Commission

Canada allocates $146.6 million for a Digital Safety Commission to enforce the Online Harms Act, hiring 330 staff to regulate online platforms and combat hate speech. Critics fear infringement on freedoms and favoritism towards Big Tech.

DSA Ruling: ExTwitter Must Pay Up for Shadowbanning; Trolls Rejoice

A court ruling under the EU's Digital Services Act holds that ExTwitter must compensate a user whose posts were shadowbanned without notice. Critics argue that obliging platforms to disclose visibility restrictions undermines traditional shadowbanning and makes it harder for websites to deal with trolls and other bad actors, with significant implications for online platforms operating in the EU as the debate continues.

Senate to Vote on Web Censorship Bill Disguised as Kids Safety

The Senate will vote on the Kids Online Safety Act, which aims to protect minors online but faces criticism for potential free speech restrictions and inadequate support for mental health resources.

1 comment
By @RevEng - 4 months
The worst part about this kind of censorship is that it allows a national body to decide what is appropriate for your child to see. Moderating what your child can engage with has always been a right and responsibility of the parent, and every parent has a different view of what they want for their child. By centralizing this decision and applying it uniformly to everyone, it will inevitably settle on a lowest common denominator, sheltering our children from things that we may deem safe or even important. Just look at how often sex education material has been withheld from adolescents at the very time in their lives when they most need to learn about it, and how often advice on mental health issues is hidden from those who urgently need it. On many platforms today, using keywords like "rape" or "suicide" will get your content removed, or even your account banned, without any understanding of the context in which those words are used.

I don't want the most prudish of prudes making laws that remove from our global communication system every piece of content they deem even slightly inappropriate, even if it's only for minors. This widespread banning of ideas is exactly the horror of censorship that sci-fi has warned us about for decades.