Facebook should do more on voter suppression, hateful content, civil rights audit says


Facebook on Sunday released the latest findings from its ongoing civil rights audit, including a series of pledges to better address harassment against activists and people of color, as well as plans to contain political misinformation ahead of the 2020 presidential election and census.

Facebook over the past year has been undergoing a civil rights audit led by Laura Murphy, a top civil rights attorney. Since last year, Murphy has conferred with more than 90 civil rights organizations about their concerns over how the platform can be abused to discriminate against minorities and other marginalized groups.

Sunday’s report, the second installment of the audit, includes insight into the pilot programs and policy changes Facebook is considering in order to address the concerns raised by people of color on the platform.

To emphasize that such work is ongoing, Facebook Chief Operating Officer Sheryl Sandberg wrote in a blog post that the tech giant will launch a formalized civil rights task force that will exist even after the audit ends later this year.

“Perhaps most importantly, today we’re announcing plans to build greater awareness about civil rights on Facebook and long-term accountability across the company,” Sandberg wrote. “Since the first audit update in December, I created a civil rights task force made up of senior leaders across key areas of the company. Today, we’re going one step further and formalizing this task force so it lives on after the audit is finished.”

The task force, which will include top executives from across the company, will listen to and seek to address civil rights concerns from outside groups and employees. Facebook has also agreed to institute civil rights training for some employees and bring on outside experts in issues like voter suppression.

Muslim Advocates, one of the organizations that has been instrumental in pushing Facebook toward a civil rights audit, said in a statement that the task force “will not result in meaningful change.”

“Facebook’s announcement that it will convert an ad hoc, interdepartmental collaboration of current staff tasked with addressing civil rights concerns into a permanent configuration will not result in meaningful change,” Muslim Advocates wrote. “It is clear that Facebook’s leadership continues to fail on this front.”

The group is calling for CEO Mark Zuckerberg and Sandberg to step down from the board, for greater diversity on the board, and for the hiring of a top-level civil rights expert and ombudsman.

Murphy in her report said she believes Facebook is taking civil rights issues more seriously than ever before, particularly following reports that Russian trolls used the platform to exacerbate racial divisions in the U.S. in 2016. Members of Russia’s Internet Research Agency used Facebook and its image-sharing platform Instagram to discourage black voters from going to the polls, to spread misinformation around voting and to amplify racially divisive content.

Now, according to Murphy’s report, Facebook is testing a host of policy changes that could address some of its issues with hate speech, harassment and voter suppression, as well as ongoing complaints from civil rights activists that their posts are taken down even when they condemn bigoted content.

The company announced it is dedicating significant resources to staving off misinformation around the census, which some warn could lead to a drastic undercount of minorities if they feel threatened or are misled by bad actors on the platform. The company plans to unveil a policy in the fall banning census interference outright.

And when it comes to hateful and bigoted content, Facebook is piloting a program that would assign some content moderators to hate speech exclusively, allowing them to focus more attention on, and receive more training in, what specifically constitutes “hate speech.”

Those reviewers would be asked to assess the context of posts more closely. The auditors found that Facebook sometimes takes down content from activists because its moderators fail to assess the broader context of a post, for instance, when a black activist shares an image of a Nazi protest with a caption educating users about bigotry and hatred.

Facebook has committed to updating its content moderation tools to better emphasize context, such as comments or captions.

Murphy’s audit team is also asking Facebook to expand its hate speech policy to include “continents and regions larger than a single country,” such as “the Middle East” or “Latin America.” The auditors have found that Facebook’s policies only bar hate speech on the basis of “national origin,” but harassers often target people’s regional origin as well.

The next phase of the audit is expected to address harassment, content moderation appeals and penalty systems in more detail, according to Murphy.

Facebook has made a series of significant changes over the past six months to better deal with hatred and discrimination on its platform. In a settlement with a coalition of civil rights organizations, Facebook has agreed to limit advertisers’ ability to target ads about housing, employment and credit to certain communities, after activists raised concerns that minority groups were being denied important opportunities through Facebook’s targeted advertising practices.

And the company earlier this year agreed to ban posts promoting “white nationalism” and “white supremacy,” which it had not previously prohibited.

Progress has been slow, and civil rights groups have become agitated, pushing Facebook to take more substantive steps rather than promise policy changes somewhere down the line.

“The thing I wanna see from the audit is mostly actions from Facebook,” Carolyn Wysinger, a member of Color of Change, told The Hill in a phone interview on Friday.

Change the Terms, a coalition of civil rights groups that Color of Change is a part of, was at the forefront of the push to make Facebook take civil rights concerns more seriously.

Wysinger said she recently sat down with Facebook’s public policy director Neil Potts to express her concerns, particularly around content moderation. Facebook has taken down multiple posts by Wysinger in which she expressed concerns about racial discrimination in the U.S.

She said she hopes content moderators will receive more training and will become full-time Facebook employees. Right now, the bulk of Facebook’s thousands of content reviewers are contractors hired by outside companies.

Color of Change, a digital civil rights group, has been pressing Facebook over the struggles faced by people of color on the platform since 2015. In 2018, the company agreed to a civil rights audit under enormous pressure from Color of Change and other groups.

And in November, the company agreed to release publicly future findings from the audit, after it was discovered that Facebook had hired a public relations firm — Definers — that sought to smear groups like Color of Change and other top Facebook critics.

“We know these are the first steps to developing long-term accountability,” Sandberg wrote in her blog post. “We plan on making further changes to build a culture that explicitly protects and promotes civil rights on Facebook.”

