YouTube is outpacing its social media rivals when it comes to curbing the spread of misinformation during breaking news events, while Facebook and Twitter are still struggling.
From this week’s bomb scares to news of a migrant caravan slowly making its way toward the U.S., misinformation and hoaxes have run rampant on most social media platforms.
Facebook and Twitter have stumbled in the past in their efforts to keep their platforms from becoming clearinghouses for unsubstantiated stories and armchair conspiracy theories that rapidly snowball in digital communities. But in the wake of this week’s major breaking news events, YouTube appears to have bolstered its safeguards against the flow of misinformation.
Facebook and Twitter searches for terms related to the migrant caravan and the bomb scares auto-populated on Wednesday with suggestions pushing the claims that George Soros is linked to the caravan and that the bombs are a false flag.
Searching for “soros” on Twitter offered up the search suggestion “soros caravan,” while typing in “bomb” on Facebook yielded search suggestions such as “bomb false flag.”
Jonathan Albright, a professor at Columbia University who researches disinformation, said those search suggestions are particularly important because they can guide people who aren’t savvy news consumers or send people down rabbit holes of misinformation about current events.
“It’s about the process when you’re searching,” he said. “Suggested searches can shape what you actually end up with in the results.”
YouTube searches generally haven’t prompted users with such suggestions. And when users search for conspiracy theories, most of the results come from vetted outlets on the center, left and right.
YouTube, which is owned by Google, said that’s the result of a concerted effort on its part to reduce the flow of misinformation on its platform, particularly during breaking news situations.
Albright said one possible solution for social media sites like Facebook and Twitter is to temporarily turn off suggested searches during sensitive situations.
“When a controversy starts to surface there should be a mechanism that turns off suggestions until the news dies off,” he said. “It would take a human curator, but with something like the Parkland shooting or Soros, misinformation works, in part, around those suggestions.”
YouTube didn’t explain how it handles sensitive suggested searches, but noted that over the past year its push to surface authoritative news through its ranking system has helped it prioritize vetted outlets over obscure, conspiracy-pushing users in its news results.
Beyond suggested searches, the search results themselves on Facebook and Twitter returned videos pushing the conspiracy theories. On Wednesday, Facebook prominently displayed videos backing unsubstantiated claims in search results for the caravan and the bomb scares. Content from hoax peddlers was also easily accessible in Twitter searches.
Misinformation, particularly during breaking news situations, is not a new problem, but Albright noted that this week was definitely a low point.
“It’s bad,” he said. “It’s really bad.”
Despite the commitments by social media giants to reduce the spread of misinformation, their platforms still allow false claims to spread, in part because of how they’re designed and the lack of effective safeguards.
Parts of Facebook and Twitter have become Petri dishes for conspiracy theories about the origins of the migrant caravan and the bombs sent to prominent Democrats and CNN. Certain groups on those platforms have circulated the idea that the migrant caravan is being funded by hedge fund billionaire and prominent Democratic donor George Soros, who was among the targets of mail bombs this week.
Those same online communities pushed theories that the attempted bombings were part of a false flag plot to make Republicans look bad ahead of the Nov. 6 midterm elections.
Facebook and Twitter declined to comment when asked about misinformation on their platforms.
For all their improvements, YouTube and its parent company still aren’t perfect. On Thursday, autocomplete results on both YouTube and Google for “pipe bombs” featured “pipe bombs sent by democrats” among the suggested search options. Search results for conspiracy theories like “soros caravan” on YouTube still surfaced a small number of videos promoting misinformation.
For their part, Facebook and Twitter appear to be getting a better handle on the problem. Both platforms cleaned up their search results on the caravan and bomb scares between Wednesday and Thursday; searches that previously surfaced conspiracy theories as top results instead showed fact-checks from Snopes and FactCheck.org. The conspiracy videos no longer appeared in search results either.
But to some, those changes came too late. By Thursday, some of the conspiracy theories had already moved on to other mediums.
When that happens, Albright warned, the entire flow of misinformation can be exacerbated.
“They’re the people that help kind of create the bridge between the rumors that happen on the radar of social media and they help to legitimize the controversy,” he said.