
Sens. Warren and Graham propose a bipartisan overreach on Big Tech


On July 27, Sens. Elizabeth Warren (D-Mass.) and Lindsey Graham (R-S.C.) announced legislation to establish a new agency responsible for policing major tech companies. The Digital Consumer Protection Commission Act aims to regulate Meta, Google, Amazon and other Big Tech firms with strong penalties for violations, including revoking these companies’ licenses to operate. “It’s time for meaningful, structural change to rein in Big Tech,” the senators wrote in a press release.

This is a powerful moment. Who would have thought that Sens. Warren and Graham could sit down in a convivial manner and collaborate on joint legislation? That these polar-opposite leaders have come together to put forth a bipartisan effort demonstrates that aversion to Big Tech is universal, and that our democracy still functions. Unfortunately, the legislation they’ve proposed is likely destined for failure. Here are four reasons why:

First, the act is extremely wide-ranging, covering a plethora of seemingly disparate issues. Not only does the measure require the creation of an entirely new federal agency — a Herculean task in itself — it also lays out numerous mandates covering fair competition and antitrust; protections for kids; data privacy and collection transparency; content portability; content moderation; and even national security.

While American citizens and most members of Congress may agree on solutions to several of these issues, such as content portability and transparency, there are chasms of disagreement over other items in the bill. Bundling in a cluster of politically controversial directives will prevent Congress from passing the common-ground components of this legislation.

Second, there are already existing agencies and laws that tackle many of the issues the act addresses. The Federal Trade Commission and the Department of Justice are responsible for regulating monopolistic and anticompetitive practices. The Federal Communications Commission already regulates Big Tech and has oversight responsibilities for protecting young users. In areas that need beefing up, such as data privacy and collection transparency, new laws can be passed giving broader purview to established regulators. Creating an entirely new agency is both inefficient and politically polarizing.

Third, the act includes provisions that interfere with the discretionary rights of private companies and create moderation nightmares. It puts a federal agency in the middle of debates over posts flagged by users for violating a company’s terms of service. This will predictably lead to millions of complaints and appeals for the government to decide.

Fourth, the measure holds companies accountable for content on their platforms and requires them to “mitigate the heightened risks of physical, emotional, developmental, or material harms.” Due to liability concerns, companies could preemptively spy on their users and censor anything and everything that could possibly be deemed to cross the line.

The act also declares that the new commission will publish “best practices” for content moderation policies that sites can adopt. The government should not dictate moderation guidelines for private companies to follow, especially while threatening to financially penalize or even revoke the licenses of companies that don’t comply with the commission’s directives.

Let’s apply some common sense here. Other than content that is clearly illegal, platforms should set their own moderation policies without fearing government retribution. Users can then choose to move to platforms that match their values and tastes, as they do today. As an aside, the act’s requirement for content portability between platforms is a great idea, and again one much better addressed in narrow legislation that would likely be supported by both sides of the aisle.

A final note: the act ignores the elephant in the room, Section 230. This law, enacted in 1996, ensures fair competition by shielding all web companies, big and small, from liability for user-generated content. As intended, it gives startups the liability protection they need to compete against established giants.

Section 230 is currently under attack by leaders in both political parties for giving companies too much control over moderation decisions. The best compromise here is to protect Section 230 while making careful reforms to it. One example is holding companies accountable for harmful content they actively promote and target at their users.

The act set forth by Warren and Graham has very good intentions. It addresses many significant problems posed by Big Tech firms and signals the powerful bipartisan desire to fix them. But creating a new omnibus agency that overlaps with several existing agencies isn’t the best prescription. 

Mark Weinstein is a world-renowned tech thought leader and privacy expert. He is the founder of the social network MeWe, which he left in July 2022, and is currently writing a book on the intersection of social media, mental health, privacy, civil discourse and democracy.
