For platform regulation, Congress should use a European cheat sheet
Last year was a crazy year for internet platform regulation. Congress introduced over 20 separate bills to amend a key U.S. internet law: Communications Decency Act Section 230, better known as CDA 230. CDA 230 protects platforms ranging from Facebook to online retailers and hobbyist blogs, largely shielding them from liability for user content and giving them leeway to moderate that content.
While CDA 230 has attracted criticism from across the political spectrum, many of the 2020 “reform” bills were arguably some combination of unserious, undemocratic and unconstitutional — and ultimately had little chance of success. President Trump added to the chaos with a dramatic gesture in December, vetoing the annual defense authorization bill because Congress did not meet his demand to simply eliminate CDA 230.
It’s a new year. It’s time to get serious. Platform regulation has a real shot at passage this cycle — for better or for worse. Badly designed laws will have direct consequences for ordinary internet users and the U.S. economy. It’s not an area where lawmakers should shoot from the hip.
Fortunately, they don’t need to.
One way to understand the realistic alternatives to CDA 230 would be to study other existing legal models, as well as facts about platform content moderation. The material my students study at Stanford Law School would be a great foundation for that. But U.S. lawmakers can also do something I’m sure my students would never do: crib answers from the people who already did the work.
For platform regulation, many of the world’s experts today are in Brussels. European civil servants have been putting serious effort into reforming Europe’s rules, going back to at least 2012. The European Commission’s 2020 consultation on online content alone netted some 300 formal position papers, and thousands of other comments. The commission’s hard work shows in the major new legislation it introduced in December, the Digital Services Act (DSA).
Congress could learn a lot from the DSA, as several experts have pointed out. Of course, not everything in its 113 pages is a good idea, and not all of it fits with U.S. policy goals. One major sticking point: it tells platforms to remove speech the U.S. Constitution protects. And given the U.S. internet economy, lawmakers here are unlikely to embrace all the tradeoffs the DSA strikes between regulation and innovation. But plenty of the ideas in the DSA are common sense. We can use them to improve Washington’s current free-for-all CDA 230 debate. Here are three of the big ones.
Not all internet companies are the same.
U.S. law generally gives platforms of all sizes the same legal immunities. If Congress replaced those blanket protections, a one-size-fits-all approach could be devastating for the numerous small companies that depend on CDA 230. They can’t afford to hire armies of moderators or fund years of litigation the way Facebook and Google (where I used to work) can. And most don’t pose the same risks or exert the same power that the giants do. The DSA recognizes these differences. It spells out obligations based on size, including special rules for “very large” platforms. That metrics-based approach isn’t ideal, but making rules that vary by company size is vastly better than subjecting the whole internet to rules designed for today’s behemoths.
Different rules can make sense for different internet functions, too. Laws designed for an online craft fair like Etsy are unlikely to make sense for an infrastructure provider like Amazon Web Services or Comcast. That’s in part because intermediaries deeper in the internet’s technical “stack” often can’t take down just one post or comment — they can only act against entire sites or apps. EU lawmakers understand these distinctions, and the DSA provides different rules for different technologies.
Lawmakers should use the right legal tools for the job.
The EU has one major law governing privacy online, the General Data Protection Regulation; it is creating a second law to set competition rules for “gatekeeper” platforms; and the DSA will govern platform responsibility for online speech and content. American lawmakers and pundits, by contrast, keep trying to tackle speech, privacy and competition all at once. The results are predictably incoherent.
At the competition/speech intersection, lawmakers like Sen. Ted Cruz (R-Texas) berate Silicon Valley CEOs as monopolists, then support changes to CDA 230’s speech rules that would entrench the industry’s largest players while squeezing out their smaller competitors. At the privacy/speech intersection, some lawmakers try to leverage CDA 230’s immunities to incentivize companies to adopt better privacy practices. That garbled approach wouldn’t be necessary if Congress enacted basic, long-overdue privacy protections for internet users. If the concern is about platforms’ gatekeeper power over online businesses and discourse, antitrust and competition laws can provide the right tools for enforcement. Issues like housing discrimination and ad targeting could similarly be addressed without involving CDA 230. Congress should, like EU lawmakers, identify which problem it wants to solve. Then it should use the right tools for the job.
Don’t ignore the real-world problems with content moderation.
Europe has long had laws requiring platforms to take down illegal content that they discover or are notified about. That created some real-world problems, which the DSA sets out to correct. In practice, platforms facing poorly defined legal obligations often take down much more speech than lawmakers intend. Among other problems, they honor takedown demands from abusers who submit false legal accusations — including governments pressuring platforms to erase evidence of police brutality, priests hiding evidence of child abuse and businesses trying to harm competitors.
Large parts of the DSA are explicitly dedicated to fixing these problems. Among other solutions, it creates new protections for users, including appeal mechanisms — not just to platforms, but to independent arbiters and courts. That kind of after-the-fact judicial review might not be the right fit for U.S. speech and due process norms. But U.S. lawmakers could use courts to define takedown obligations in the first place. That’s what Sens. Brian Schatz (D-Hawaii) and John Thune (R-S.D.) did in the PACT Act, the most thoughtful of the 2020 CDA 230 revision bills.
The takeaway for this year’s crop of CDA 230 reform bills is that seemingly simple rules can have serious unintended consequences. The EU experience tells us a lot about pragmatic ways to anticipate and correct those consequences. It would be ironic if the U.S., in its rush to change CDA 230, missed that lesson and embraced the same flawed rules that Europe is now abandoning.
At the end of the day, the U.S. and EU have a lot to learn from each other.
We aren’t all going to agree on some basic things, such as which speech should be legal or how to balance innovation and competition against other societal goals. But differences between American and European approaches shouldn’t prevent us from finding common ground on the more functional aspects of platform regulation. The reasons to seek alignment are powerful, whether as an economic measure to streamline consumer protections or as a human-rights-based effort to counteract China’s increasing influence on internet policy. After years of hard work, European lawmakers have come up with some good ideas. U.S. lawmakers who are serious about changing CDA 230 should look at them.
Daphne Keller directs the Program on Platform Regulation at Stanford’s Cyber Policy Center. She was previously associate general counsel for Google.