Court ruling prompts fears of ‘Wild West of disinformation’

An order limiting the Biden administration’s communication with social media companies could make it harder to curb disinformation as the 2024 election nears.

A federal judge on Tuesday curtailed communication between certain Biden administration agencies and social media companies, siding with a GOP-led challenge that argued the administration's efforts to combat disinformation violated protected speech.

The ruling left experts concerned about a “chilling effect” on attempts to moderate false information online.

“If we end up with basically no meaningful content moderation, then it is going to be a Wild West of disinformation,” said Darrell West, a senior fellow at the Brookings Institution’s Center for Technology Innovation.

Two Republican state attorneys general argued that the Biden administration “coordinated and colluded with social-media platforms to identify disfavored speakers, viewpoints, and content.” The result, they said, was a “campaign of censorship” executed by the administration.  

U.S. District Judge Terry Doughty, a Trump appointee, ruled in their favor, barring Biden administration officials from contacting social media companies about "in any manner the removal, deletion, suppression, or reduction of content containing protected free speech posted on social-media platforms."

Officials from the Department of Health and Human Services, the Centers for Disease Control and Prevention, the Department of Justice, the State Department and the FBI were told to cut those communications with the companies.  

The case had primarily taken aim at attempts to curtail disinformation during the COVID-19 pandemic, which Republicans decried as a violation of the First Amendment.

White House press secretary Karine Jean-Pierre said Wednesday that administration officials disagree with the court decision.

“Our view remains that social media platforms have a critical responsibility to take action or to take account of the effects of their platforms,” Jean-Pierre said, adding that the administration will “continue to be responsible in that way.” 

The Justice Department filed a notice of appeal Wednesday evening and expects to request a stay of the district court's decision, a department official said. The appeal will go to the 5th Circuit Court of Appeals, where a majority of judges are GOP appointees.

Alice Marwick, principal researcher at University of North Carolina at Chapel Hill’s Center for Information, Technology and Public Life, said the ruling perpetuates a narrative that cracking down on disinformation — false information meant to mislead — is code for government suppression.  

“This [ruling] is part of a political ploy to change the meaning of disinformation from information that’s incorrect and harmful to a sort of political slur — the idea that labeling something disinformation is tantamount to censorship,” Marwick said. “What this ruling does is it really continues that narrative.” 

The lawsuit, brought by the attorneys general of Louisiana and Missouri, has drawn support from Republican lawmakers and public figures who say social media platforms moderate right-wing content at greater rates than left-wing content.

Sen. Eric Schmitt (R-Mo.) called the order a “huge win for the First Amendment and a blow to censorship.” 

It’s not likely the ruling will have an immediate impact on online disinformation, Marwick said. But over time, social media platforms might choose to limit their moderation efforts out of fear of legal and political ramifications. 

In the years that have followed the COVID-19 pandemic and 2020 presidential election — events during which misinformation and disinformation exploded online — social media companies have already begun taking less stringent approaches to moderating content on their platforms.  

YouTube announced last month that it would stop removing content that perpetuates false claims of “widespread fraud, errors, or glitches” in the 2020 or other past U.S. presidential elections. Meta also said last month that it would roll back a policy meant to curb misinformation related to COVID-19, and in November, Twitter said it would not stop users from spreading false information about the virus and its vaccines.  

The trend away from strictly moderating false content could make it easier to spread disinformation as the 2024 election approaches, West said.

“If you take this ruling in conjunction with decisions that major platforms already have taken, it just creates a huge problem for 2024,” he said. “I think we’re going to face a tsunami of disinformation.” 

Samir Jain, vice president of policy at the Center for Democracy and Technology, said an expected “explosion” of disinformation generated by artificial intelligence as the presidential election nears exemplifies the risks posed by decreased communication between government officials and social media companies.  

“It’s unclear, in the wake of this order, the extent to which a government can alert social media providers to intelligence it’s seeing or information that it has about particular kinds of trends and misinformation and disinformation,” he said.  

Administration officials were not barred in all cases from communicating with social media platforms. The order indicates that Biden administration officials can confer with platforms about criminal activity, national security threats, threats to public safety and posts “intending to mislead voters about voting requirements and procedures.” 

But even with those exemptions, the order’s sweeping directive leaves room for interpretation.  

“This is a very broad-based decision,” West said. “And it actually, I think, marks new territory in terms of First Amendment rights.” 


Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.