
When disinformation becomes a political strategy, who holds the line?


Even after the Electoral College affirmed Joe Biden’s victory, President Trump and his allies continue to push unfounded claims that the election was stolen. While their disinformation campaign didn’t overturn the results, it has sown distrust among voters. A recent poll found that only one in five Trump supporters believes the election is settled and Biden is the legitimate winner.

When elected officials use disinformation as a political weapon, our society’s information intermediaries – primarily news media outlets and social media platforms – are among the first lines of defense. Here’s what my colleagues and I, who study disinformation at Harvard’s Berkman Klein Center for Internet & Society, have learned about how they can respond effectively.

Disinformation as political strategy

Our research at the Berkman Klein Center finds that disinformation is increasingly being used by political leaders as a calculated strategy to shape public narratives and manipulate voters. A recent study by my colleagues, led by Yochai Benkler, found that the myth of mail-in voting fraud was disseminated from the top down, starting with President Trump and GOP leaders and trickling through established media outlets. 

The study’s analysis of media stories and social media posts on mail-in voting revealed that “Fox News and Donald Trump’s own campaign were far more influential in spreading false beliefs than Russian trolls or Facebook clickbait artists.” These efforts were among a wave of disinformation pushed by legitimate political actors during this election, from fake ballot drop boxes to false reports of ballot tampering. 

Stopping the flow of disinformation

When disinformation is promoted by political leaders, it raises a thorny set of questions about how to respond. This fall I co-authored a series of exercises to help four key actors – media outlets, social media companies, the national security community and elections administrators – better understand their respective roles in containing election-related disinformation. Two critical lessons from the exercises are: (1) disinformation rapidly cascades across the information landscape when public figures promote it, and (2) media coverage and social media content moderation most directly influence how disinformation spreads. 

This election, media and tech platforms demonstrated a greater understanding of their critical role as conduits in chains of information and how their actions can shape the ability of other actors to respond. For example, media exercised caution as they made projections in the days after November 3rd and generally waited for indications from state election administrators before calling races. This helped local officials maintain their rightful authority over the results in their states and preempted false narratives that the media was attempting to influence the outcome. 

How information intermediaries can help

The Trump presidency brought the use of disinformation as a political strategy to new heights. While responding is a society-wide, multi-stakeholder challenge, media and tech platforms play an essential part in ensuring disinformation doesn’t overtake our public discourse.

First, they must stop assuming that users can always successfully separate fact from fiction, especially when it comes from those in positions of power. Warning labels, for example, are an increasingly popular strategy for handling disputed information from public figures, but there is little evidence that they stop disinformation from going viral. According to Twitter, 74 percent of people who viewed election tweets carrying a warning saw them after the label had been applied, suggesting that the claims in these posts continued to spread despite the labels.

Labels and disclaimers alone aren’t enough. They need to be paired with an aggressive push to build public resilience to false narratives by blocking disinformation and countering it with the truth. For instance, several television networks cut away from Trump’s November 5 press conference once it devolved into a slew of false claims. At Fox News, the decision desk stood by its choice to call the race for Biden and anchors pushed back against pundits and guests seeking to delegitimize the results. 

While exercising stronger editorial control over leaders’ statements can often be politically sensitive, doing so empowers the public to make better decisions about the truth of what they see online. Media outlets and tech platforms can counter potential criticism by developing clear policies on how to handle misleading claims from public figures and enforcing them consistently.

Clearer policies can help information intermediaries act quickly, decisively and transparently. YouTube and Facebook were both criticized for not moving rapidly enough to take down election-related disinformation, such as news-style videos falsely claiming Trump won. Facebook deleted several Stop the Steal groups, but not before members staged protests outside ballot counting locations, a concerning example of how online disinformation spills over into the offline world. 

A turning point

Our societies are facing a crisis of truth as people increasingly distrust experts and struggle to determine which sources are authoritative. Relying solely on users’ ability to distinguish fact from fiction shifts responsibility away from intermediaries, which have the capacity to exercise stronger editorial control over false content, and onto users, who are not always equipped to contend with the sheer volume of misleading information online.

The 2016 election woke us up to the threat of foreign interference in our political processes. The 2020 election demonstrated the danger posed by homegrown disinformation, especially from political leaders. When our leaders spread disinformation without sufficient pushback, our relationship to knowledge and truth erodes, and so does our democracy. Media companies and tech platforms must continue to mount stronger responses to ensure that disinformation from our own leaders does not overtake our politics.

Oumou Ly is a staff fellow at Harvard University’s Berkman Klein Center for Internet & Society working on the Assembly: Disinformation Program and hosting their web series The Breakdown. She is an expert on best practices for countering disinformation and how institutional structures promote or mitigate disinformation.

