YouTube on Thursday expanded its policies against hate and harassment to include removing content targeting an individual or group with a conspiracy theory like QAnon that has been linked to real-world violence.
The update comes less than a week after the platform’s CEO, Susan Wojcicki, came under criticism for declining to commit to an outright ban of the sprawling QAnon conspiracy theory, a step Facebook took last week.
“I think if you look at QAnon, part of it is that it’s a grass-roots movement, and so you could see just lots and lots of different people who are uploading content that has different QAnon theories,” she told CNN’s Poppy Harlow. “We’re very proactive in terms of removing it, and I think you’ll see us continue to be so.”
The new policy is technically not a full ban, but a Google spokesperson told The Hill that the effect would be similar, saying the vast majority of QAnon content on the platform would be affected.
Starting Thursday, videos that target an individual or group by linking them to QAnon could be removed. YouTube operates under a three-strikes policy, in which the first two violations result in a temporary freeze on posting and the third is grounds for banning an account.
QAnon’s supporters believe without evidence that President Trump and his allies are working to expose and execute a cabal of Democrats, media figures and celebrities who are running an international child trafficking ring.
The theory has grown in scope over the last few years, functionally becoming a metanarrative that ties several distinct conspiracy theories together.
Social media has been key to the growth of the QAnon community and continues to be used to loop in unsuspecting people by boosting campaigns such as #SaveTheChildren and co-opting anti-mask groups.
YouTube has emphasized the steps it has already taken to limit the spread of QAnon content, including changes to its recommendation system that it claims have reduced the views prominent QAnon-adjacent channels receive from recommendations to non-subscribers by more than 80 percent since the beginning of 2019.
It also claims to have taken down thousands of videos and hundreds of accounts tied to QAnon under existing policies, and it promotes content from authoritative sources on generic searches about QAnon.
However, the site has continued to be a hub for QAnon content.
As always when targeting extremist content, implementation of the policy update will be crucial.
The QAnon community, especially the larger accounts that promote it, has already shown a willingness to adapt to new platform policies and evade moderation by changing its vocabulary.
The Google spokesperson told The Hill that the new policy would be applied retroactively.
That could significantly impact some of the biggest QAnon accounts on the platform that are used widely to indoctrinate people into the community.
“QAnon followers frequently cite Fall Cabal, Out of Shadows and Joe M’s ‘The Plan to Save The World’ as videos that radicalized them,” Travis View, co-host of the “QAnon Anonymous” podcast, told The Hill. “It would be a blow to QAnon recruitment efforts if those videos were pushed onto less popular video sites.”
These accounts were taken down shortly after publication of this story. More removals are expected throughout the day.
Facebook earlier this month committed to banning all accounts affiliated with QAnon, an escalation that has effectively reduced the conspiracy theory’s presence on the platform. While some groups and accounts have evaded removal, researchers have praised the takedowns so far.
Following that announcement, a series of other platforms, including Etsy, Triller and Peloton, announced similar bans with varying degrees of success.
Twitter has removed thousands of accounts affiliated with QAnon from its platform and limited the reach of content about the conspiracy theory, but does not have a blanket ban.