The views expressed by contributors are their own and not the view of The Hill

QAnon proves internet companies aren’t up to the task of defending democracy

As the electoral drama unfolded on the evening of November 3, the nation held its breath. Civil society groups prepared for turmoil, journalists for rapid response and tech companies to stem the spread of disinformation. 

In the early hours of the morning, the networked factions that back President Donald Trump — disparate groups united by their support of the president — applauded his premature declaration of victory. Some turned to conspiracy theorists, operating in hives online, to make sense of the unfolding turmoil. Then they amplified the misinformation created in these spaces. 

One group associated with such conspiracy theories is QAnon, which has contributed to the spread of misinformation in the 2020 election. The QAnon movement is centered around an individual (or group), referred to as Q, who claims to be part of a secret U.S. intelligence operation, disseminating esoteric propaganda to encourage support for Trump’s imaginary crusade against forces of the so-called “deep state.” It originated on 4chan, migrated to 8chan, then found a home on 8kun — message boards designed to share memes and anime, not foster extremism. But their characteristics made them attractive homes for groups ranging from the hacktivist collective Anonymous and the reactionary Gamergate movement to white supremacist terrorists. They also have been a home for anti-democratic speech and celebrations of political violence.

The growth of the QAnon conspiracy is the work of media manipulation by a small group of motivated actors, who move the storyline along across networked platforms. Like networked social movements that have used the internet as an advocacy platform, QAnon followers have managed to create a resilient cross-platform ecosystem of content and influencers that has shuttled misinformation across its various hubs for the last three years. Eventually Trump, whom QAnon followers largely support, acknowledged and tacitly defended the conspiracy. As 2020 has shown us, political representation is on the horizon — several Q candidates were on ballots across the country, including Marjorie Taylor Greene, who won a seat in the House, and Sen. Kelly Loeffler (R-Ga.), who is facing a runoff election to retain her seat in the U.S. Senate.

In 2020, the limited data we have from polling and critical reporting suggests millions are now aware of and may be on board with this movement. QAnon has become a fully networked conspiracy complex with numerous entry points for new followers, such as breaking news events, celebrity gossip and political intrigue. The movement uses pseudonymity to avoid attribution on social media, relies on distributed amplification to quickly spread disinformation and is fostered by fringe “alt-tech” platforms like 8kun and Gab. Sheltered by these platforms known to harbor extremist groups, QAnon punches well above its weight, affecting our media, democratic institutions and public health.

While QAnon is not the alt-right, both movements grew in the same places. QAnon first came to popular attention when its supporters became visible at Trump rallies, and it spread globally during COVID-19 lockdowns. Steeped in ancient antisemitic tropes, QAnon members engage in misguided research, networked harassment of politicians and blind support for Trump. They are not the originators of these conspiracy theories, but the amplifiers often look to Trump himself for tacit recognition, and they rely on social media to grow their ranks. 

Social institutions around the world are struggling with anti-democratic movements weaponizing social media. A few people can rapidly deploy disinformation across networks to deadly results. QAnon was initially spread by three conspiracy influencers before it was taken up on large platforms. This network of influence is much like a fandom, and it mimics the form of activist groups. We have seen how these methods were used to deadly effect by white nationalists.

After Charlottesville in 2017, platforms finally removed many of the extremists who used their systems to organize the deadly Unite the Right rally. QAnon, unlike the alt-right before it, is not focused on ethnonationalism, but rather the acceleration of civic decay in the form of political and medical disinformation, including vaccine hesitancy. While the vast majority of QAnon influencers and believers do not advocate violence, some have taken matters into their own hands.

Just as QAnon co-opted the fight against human trafficking with the #savethechildren hashtag, the movement isn’t bound to the 2020 elections. On its docket now is Agenda 21, the belief that global leaders are plotting a depopulation genocide to favor elites. 

How are platforms responsible?

The manipulation of social media by unknown actors fundamentally disrupts democratic communication. This lack of identity leads to a lack of attribution, leaving our political communication in the so-called “new public square” of social media vulnerable to both domestic and foreign interference. As power and wealth are consolidated around these platforms, they show us time and time again that they are unable to successfully mitigate these campaigns. Now, as we see the impact on electoral politics, we must consider the true cost of disinformation and brace for its continual impact on our democratic institutions long past the elections. Internal leaks from Facebook suggest the movement was allowed to grow, unfettered, for far too long despite internal concern.

What can be done?

In the chaos that exists between breaking news and verified information, disinformation thrives. Most recently, two individuals associated with the QAnon movement were arrested in Philadelphia for an alleged plot to attack the site of ballot counting. As liberals call for regulation, and conservatives rally around the abolition of Section 230, which governs liability on platforms, we cannot lose sight of what is at stake. Coalitions, like Change the Terms, have long worked to hold platforms accountable by creating model policies on hate speech. While debates about content moderation are about QAnon and Trump right now, they will not always be. The enduring influence of QAnon on political communication is a symptom of how social media platforms remain unable to adapt to evolving use cases, and the only way to counter it is to recalibrate how platforms moderate content, especially conspiracy and medical misinformation. 

Brian Friedberg is a senior researcher of the Technology and Social Change Research Project at the Shorenstein Center on Media, Politics and Public Policy at Harvard Kennedy School. Merging academic methods and Open Source Intelligence techniques, he is an investigative ethnographer, focusing on the impacts alternative media, anonymous communities and unpopular cultures have on political communication and organization.