Technology

Tech groups urge Congress to ‘dig deeper’ on Facebook role in Capitol riot

Tech accountability groups are urging members of Congress to “dig deeper” into the role Facebook played in the lead-up to the Jan. 6 riot at the Capitol ahead of Tuesday’s House hearing about the attack, according to a report shared with The Hill on Monday. 

The groups are sending the report, composed of publicly available information and the groups’ previous findings of how Facebook was used ahead of the riot, to House and Senate leadership offices, as well as members of the House select committee formed to investigate the attack.

“These facts lead to an obvious conclusion: Facebook bears significant responsibility for the events that transpired on January 6th. The Select Committee should use their investigatory powers to dig deeper on what happened on the platform leading up to the insurrection, including behind the scenes to determine who knew what and when, in order to make sure the entity is held accountable for their role in the insurrection,” the report states. 

The renewed push from the groups, Accountable Tech, the Institute for Strategic Dialogue (ISD), Media Matters and the Tech Transparency Project, comes the day before the House select committee holds its first hearing and as the federal government continues to clamp down on the spread of misinformation on social media platforms. 

Facebook spokesperson Andy Stone defended the platform’s policies put in place to respond to election disinformation and the riot at the Capitol. 

“As we’ve said repeatedly, including before Congress, our teams were vigilant in removing content that violated our policies against inciting violence leading up to January 6th. We were prepared for this and have been more aggressive than any other internet company in combating harmful content, including content that sought to delegitimize the election,” Stone said in a statement. 

“We banned hundreds of militarized social movements, took down tens of thousands of QAnon pages, groups and accounts from our apps, removed the original #StopTheSteal Group, labeled candidates’ posts that sought to prematurely declare victory, and suspended former President Trump from our platform for at least two years.”

The report includes an analysis of previous research from the ISD about the spread of false claims of voter fraud during the 2020 election and the amplification of those claims by a “small but influential cluster of accounts.” 

When former President Trump lost the election, the “Stop the Steal movement quickly emerged as the vehicle for these frustrations,” the report states. 

The report from the accountability groups also said Facebook’s pledge to remove election-related misinformation after the riot was too narrow, allowing much of the disinformation to remain on the platform. It includes reports from Media Matters citing posts spreading such false claims from prominent accounts, including members of Congress, that were not removed. 

Along with the report, Accountable Tech is launching a digital ad campaign urging “real regulation” of Facebook. The narrator of the 47-second ad accuses Facebook of a business model that “incentivizes radicalization” and “inflames hate and disinformation.” 

Facebook, along with other social media giants, has faced increased pressure to weed out disinformation about the election — as well as the coronavirus pandemic — in recent months. 

Facebook has pushed back on criticism of its handling of coronavirus misinformation, as well, and touted its policies put in place to connect users with authoritative information about the virus and vaccines. 

Democrats have been urging platforms to crack down on the spread of false content. President Biden amped up the call to remove disinformation, specifically about COVID-19 vaccines, with a recent advisory from the surgeon general that called the spread of health misinformation an “urgent threat.” 

Tech critics on both sides of the aisle have used the push to call for reform of Section 230 of the Communications Decency Act, which provides tech companies with a liability shield for content posted by third parties. But any action on such proposals will likely remain stalled, as lawmakers are divided along party lines about what that reform should look like — and why it’s needed.