Policymakers should promote competition among digital services for social media users
Former President Trump hinted this month that he may launch his own social media network. The news would no doubt be welcomed by his supporters, and it comes as newer social media offerings like Rumble, Clubhouse and MeWe are thriving. The variety allows users to gravitate toward platforms whose features and content policies they like.
One of the laws that has allowed social media companies and other digital services to flourish is Section 230 of the Communications Decency Act. Under this law, digital services can take action against bad actors when they become aware of conduct that violates their terms of service, and can work closely with law enforcement when appropriate.
Section 230 is what enables a digital service to both host free speech online and address violent, offensive, obscene, dangerous or extremist content posted on its service, without the risk that it will be sued for its efforts to stamp out bad actors.
However, this same provision that has enabled new social media companies to enter the market and grow is now under attack both on Capitol Hill and in more than 30 statehouses across the country, driven by some policymakers’ concerns that either too much or too little content is being taken down.
On Thursday, members of the House Energy & Commerce Committee questioned the CEOs of Facebook, Google and Twitter about the decisions their companies make in moderating online content, from taking down disinformation and hate speech to banning anti-American extremism in posts.
Ten bills have been introduced so far this Congress to limit the protections that allow companies to host user content online, and more members of Congress have signaled that additional proposals are forthcoming. Several would allow anyone from foreign actors spreading disinformation to racists spewing hate to sue a company that enforces its user agreement against this type of content.
The co-author of Section 230’s important legal protections, now-Sen. Ron Wyden (D-Ore.), has noted that Section 230 is what enables free speech online. Without it, movements like #MeToo and #BlackLivesMatter might have been silenced, because platforms could have removed the content to avoid getting sued. Changes to Section 230 could therefore disproportionately harm marginalized groups. For example, social media platforms might have hesitated to host the viral videos of police violence that spurred nationwide protests for racial justice if doing so exposed them to constant litigation.
Established platforms may be better positioned to manage plaintiffs’ lawsuits. But smaller companies would be driven out of business if these bills opened the courthouse doors for bad actors to sue their way back onto platforms that have turned them away.
An equally significant threat to small digital services is the set of content moderation bills currently before 30 state legislatures. Many of them would require companies to assign to human reviewers tasks that are routinely triaged by automated processes when enforcing acceptable use policies.
A Florida bill would allow an internet user to request detailed information on why a post was removed, and a Utah bill would allow a further appeal to a human moderator. Legislation in Texas that was reported favorably out of committee this week would allow user appeals and require companies to have an employee for state residents to call.
By CCIA’s estimates, mandating human review of appeals, as Utah’s bill would, could cost services an additional 34 cents to $3.50 per user per year, depending on factors such as the rate of appeal and whether the appeals are duplicated in other states. Even the lower end of this range could be ruinous for small firms. The higher end would be a financial burden for all platforms, since even leading platforms have a global average revenue per user of only about $10 annually.
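To put that range in perspective, here is a rough back-of-the-envelope calculation, a sketch only, using the estimated cost range and the roughly $10 average revenue per user cited above; the exact figures are illustrative, not audited numbers:

```python
# Back-of-the-envelope: estimated compliance cost as a share of revenue.
# Assumes CCIA's estimated cost range ($0.34-$3.50 per user per year)
# and the ~$10 global average revenue per user (ARPU) cited above.
cost_low, cost_high = 0.34, 3.50  # added cost per user per year, USD
arpu = 10.00                      # global average revenue per user, USD/year

for cost in (cost_low, cost_high):
    print(f"${cost:.2f} per user/year is {cost / arpu:.0%} of a ${arpu:.0f} ARPU")
# Prints roughly 3% at the low end and 35% at the high end, i.e. mandated
# human review could consume over a third of per-user revenue.
```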
As Congress and state legislatures consider the landscape of free speech online and how to encourage more voices, it is worth remembering that our current laws are what allowed newer social media services to exist. Legal rules like Section 230 are what give digital services the ability both to protect users from dangerous or harmful content and to let a wide range of voices and viewpoints be heard.
Some say companies have done too little in refusing to host harmful content like hate speech; others say too much. For either side, promoting new entry by startups to compete for social media users is the right path forward. Taking away legal protections and adding costly regulations will unfortunately result in more lawsuits, fewer social media startups and online platforms, and ultimately fewer opportunities for users to be heard.
Arthur Sidney is Vice President for Public Policy for the Computer & Communications Industry Association.