Facebook, Twitter’s handling of New York Post article raises election night concerns
Facebook and Twitter largely bungled their efforts Wednesday to limit the spread of a New York Post story about Hunter Biden, inviting allegations of censorship and raising concerns about how they will handle a flood of information on election night.
Facebook was the first to take action, with communications manager Andy Stone saying the platform was applying its viral misinformation policy to limit the spread of the article and allow its third-party fact checkers to evaluate it.
But by that point the story had already been racing around Facebook, and Stone’s announcement did not appear to meaningfully limit its spread. By Thursday afternoon, it had been shared nearly 400,000 times on the platform, according to the Facebook-owned social media tracking service CrowdTangle.
Twitter approached the story differently by barring users from sharing links to it in tweets and direct messages, but without informing users that the company had determined that the article violated the platform’s policy on hacked materials.
CEO Jack Dorsey later said it was “unacceptable” how the company failed to explain its decision at the time.
The manner in which both social media giants handled such a divisive news story so late in the presidential campaign is raising questions about how tech companies will evaluate a high volume of information, and potentially misinformation, on Election Day.
Facebook has not made clear which aspect of the article led to the decision to limit its spread.
The company pointed to its policy on viral misinformation, which says that the platform can limit distribution before a third-party fact-checker evaluates a piece “if we have signals that a piece of content is false.”
Stone did not respond to requests for comment about what signals Facebook saw in this case, or how exactly the platform has limited the spread of the article.
Experts who spoke to The Hill about Facebook’s decision were split, with some praising the move to slow the spread of suspect information and others expressing concern about the failure to clearly explain the justification.
“It just struck a lot of people as odd and pretty unaccountable,” Evelyn Douek, an affiliate at Harvard’s Berkman Klein Center for Internet & Society, told The Hill.
“The overarching decision of like sending it to fact checkers and getting them to look at it, not censoring it, that’s all pretty normal,” she said, adding that the preemptive decision based on unnamed signals “didn’t make sense.”
The Biden campaign denied the main allegation in the New York Post article — that the former vice president met with Vadym Pozharskyi, an adviser to Ukrainian gas company Burisma — and has pointed to numerous investigations that have all concluded there was “no wrongdoing” by the former vice president regarding Ukraine.
Twitter’s eventual justification for blocking the article link — as well as a second Post story that was also reportedly based on hacked emails — was much clearer than Facebook’s.
The platform’s hacked materials policy says that reporting on a hack is fine as long as it does not include sharing personal information. The Post stories did not redact any information, leaving Hunter Biden’s and Pozharskyi’s email accounts visible.
But Twitter’s explanation came hours after it clamped down on spreading the story links.
“Our communication around our actions on the @nypost article was not great. And blocking URL sharing via tweet or DM with zero context as to why we’re blocking: unacceptable,” Dorsey tweeted Wednesday evening.
Regardless of the justifications behind the moves to limit the story’s spread, Facebook and Twitter’s intervention has kept the story in the news cycle and handed Republicans an opportunity to cast their actions as proof of anti-conservative bias in big tech.
President Trump, who praised the New York Post for its article, accused the two social media platforms of seeking to help Biden’s campaign by censoring the news.
Sens. Josh Hawley (R-Mo.) and Ted Cruz (R-Texas), two of Congress’s biggest critics of big tech, sent letters out to the companies demanding an explanation. Hawley even went as far as to allege that the content moderation decisions amounted to an in-kind donation to the Biden campaign and should thus be investigated by the Federal Election Commission.
The Republican-controlled Senate Judiciary Committee has threatened to subpoena Dorsey and Facebook CEO Mark Zuckerberg next week.
The fumbling of the Post story is especially noteworthy because the general election is less than three weeks away.
“I think yesterday was a microcosm of what’s going to happen again and again in the next few weeks,” Douek said. “The problem is when it looks like an ad hoc invocation of a policy or an ad hoc decision … it just opens up all of the conspiracy theories and all the charges of bias that we saw.”
Social media platforms have taken steps aimed at avoiding a repeat of the 2016 election, when they were overrun with misinformation and attempts to sway voters.
One area that could be especially important is the premature declaration of election results: the coronavirus pandemic has driven a surge in mail-in ballots, which some states cannot begin counting before Election Day, raising the odds that the final vote tally will not be known on Nov. 3.
Facebook and Twitter have both taken steps to address that possibility, pledging to label posts claiming early wins and directing users to updated vote counts. YouTube, however, has no such policy.
Douek explained that having established rules is crucial to avoiding charges of bias.
Social media platforms have emphasized that they’re more prepared to tackle the election than they were four years ago. The next few weeks will be crucial in determining whether that’s true.
“What’s clear from this episode, so far, is that at least they’re awake, they’re not completely asleep at the wheel,” said Paul Barrett, deputy director of the New York University Stern Center for Business and Human Rights.
“But we’ll see over the next few weeks … whether they’re able to keep up with the flow of false information and whether they’re willing to take the risks that are necessary, when they know that taking action will bring sharp criticism from one side or the other.”