Fringe social networks see boost after mob attack
Fringe social media networks are seeing their user bases swell in the aftermath of last week’s insurrection at the Capitol building and the subsequent banning of President Trump and some of his loudest supporters from Facebook and Twitter.
That migration raises concerns among experts that otherwise peaceful supporters of the president are moving into close proximity with extremist groups that congregate in those spaces.
Twitter last week took the unprecedented step of permanently banning the president. It also cracked down on accounts sharing conspiracy theories, removing 70,000 accounts over the span of a few days, including most of the remaining major QAnon promoters.
Facebook also suspended Trump until at least the inauguration of President-elect Joe Biden and committed to removing all content associated with “Stop the Steal,” one of the major movements pushing the false belief that the general election was rigged.
The push by mainstream platforms to weed out accounts spreading disinformation narratives may be too late to slow the spread of conspiracy theories, according to experts.
“There are dozens of sites sitting around and already built that are ready to welcome in whatever waves of people are moving from platform to platform,” Jared Holt, a visiting research fellow at the Digital Forensic Research Lab, told The Hill. “Cracking down on platforms is undeniably a good way to minimize the spread of extremism outside of its own echo chamber, but it doesn’t make extremism go away.”
Alternative social media platforms have seen huge traffic spikes in the days since these crackdowns.
Parler was one of the largest beneficiaries. Between Jan. 6 and 10, the app was downloaded from American app stores 825,000 times, according to data from Sensor Tower shared with The Hill.
The social media platform, which markets itself as a free speech haven, was removed from both the Apple and Google app stores and ultimately pulled offline by Amazon Web Services shortly after that surge. However, Parler has already transferred its domain to a new web hosting service, Epik, meaning it could soon be back online.
Gab, which has pitched itself similarly to Parler, has been largely unusable since the mob stormed the Capitol, even as new users flood in. The app has not been available from Apple or Google’s stores since 2017 over hate speech violations.
Researchers have also seen users migrate to less mainstream platforms like MeWe and CloutHub.
Between Jan. 6 and 10, MeWe, a right-wing alternative to Facebook, was installed 402,000 times from U.S. app stores, a significant increase from the 38,000 installs the week before, according to Sensor Tower's data.
CloutHub’s daily downloads surged from roughly 9,000 on Jan. 6 to more than 60,000 — the all-time high for the platform — just five days later, according to estimates by mobile intelligence provider Apptopia.
Privacy-centric platforms Telegram and Signal have also seen big upticks in activity. Telegram was downloaded 857,000 times between Jan. 6 and 11, up more than 200 percent from the year prior, according to Sensor Tower.
The platform, which allows users to join channels that serve almost like news feeds, appears to have gained an additional bump from Parler being taken down. One channel dedicated to users leaving the platform already has 16,000 members.
Signal, the encrypted messaging service, has skyrocketed in popularity. There are some exogenous factors in that case though, chiefly Tesla CEO Elon Musk shouting out the service following privacy policy updates at Facebook’s WhatsApp.
Deplatforming remains hotly debated among academics and social media observers.
Removing people like the president or Trump-supporting attorney Lin Wood from major platforms has the immediate effect of reducing their reach. Users may be reluctant to follow them onto other sites because of things as simple as having to download and get the hang of new platforms.
It also decreases the risk of otherwise uninterested users stumbling onto conspiracy content while scrolling through Facebook or Twitter.
However, it does run the risk of making the platforms that shunned users flock to more toxic, according to Ethan Zuckerman, director of the Center for Civic Media at MIT.
The migrating users may also be exposed to more dangerous groups that are already on the apps because of their relatively lax content moderation.
Many supporters of QAnon, who believe Trump is working to expose a cabal of child-eating elites in Democratic politics and the media, have migrated to Telegram, according to Marc-André Argentino, a researcher who studies the far right.
“That’s a little more concerning in the sense that they’re going to be closer to a little bit of the violent extremist movements that are looking to recruit them,” he cautioned.
Some of the more dangerous far-right groups on Telegram have already been discussing targeting channels with newcomers from other platforms for fresh recruitment.
Extremism also tends to be more difficult to track on these alternative apps, at least for now.
“There’s thousands and thousands of academics who are watching Facebook and Twitter and it’s probably a dozen watching the ‘alt-tech’ space,” Zuckerman said.
Google and Apple do not have any immediate public plans to ban the other apps in this story from their stores.
Asked about MeWe, CloutHub and Telegram, Google said apps available on its app store that contain user-generated content must implement robust, effective and ongoing content moderation. But a spokesperson declined to provide specific details as to why those apps, which are rife with violent posts, are allowed to remain on the app store.
A spokesperson for Apple did not respond to a request for comment.
As of now, these newly popular platforms are welcoming new users from major social media sites and being used to plan potentially violent demonstrations ahead of and on Biden’s Inauguration Day.
But removing the fringe platforms and cutting off the dissemination of disinformation and extremist content is just the first step toward eradicating the online conspiracies and limiting the risk that they lead to real-world violence, according to experts who spoke to The Hill.
“I think a level of triage has to be factored into the approach,” Holt said, comparing deplatforming to stopping the immediate bleeding of a wound. “Once that’s done and contained then you have to take a more intense look at the actors involved and the systemic ways disinformation and extremism is allowed to thrive on the internet.”