
Governing social media: You help to make the rules

(Photo: Facebook's Meta logo sign at the company headquarters in Menlo Park, Calif., on Oct. 28, 2021. Associated Press/Tony Avelar)

Who should decide what people can and cannot say online? Ask this question a hundred times and you’re likely to get a hundred different answers. There is no perfect system for moderating content online. But one thing we know for sure at Meta is that these decisions shouldn’t be left up to us alone.

That’s one of the reasons we’ve called on lawmakers to write a new set of long-overdue rules for the internet. Without them, too many decisions are left to private companies, resulting in a patchwork of policies and processes governing online speech and leaving too much power in the hands of too few companies.

In 2018, Mark Zuckerberg shared a blueprint for making platforms like ours more accountable for how they are governed. That led us to create the Oversight Board, a body that makes independent, binding judgments on some of our most consequential content decisions and issues policy advisory opinions that we consider and respond to publicly. We created it after consulting more than 200 experts around the world, from Oklahoma to Dubai. This past July, we reaffirmed our commitment to the Oversight Board by approving a new three-year, $150 million pledge to fund its operations through the independent Oversight Board Trust.

We know there is plenty of room to innovate further in this space. That’s why we’re exploring an industry-first experiment in how platforms can be governed, one that could have far-reaching implications for who makes the rules online around some of the most difficult questions. It involves giving a seat at the table not just to experts, but to the regular people who use these technologies. We call it a Community Forum. It brings together people from a wide array of backgrounds for structured conversations around a contentious issue, after which they agree on a set of non-binding recommendations to inform Meta’s policies.

The process itself isn’t new. Known in other settings as citizens’ assemblies or community deliberations, it has been used by organizations and governments around the world for decades to answer some of the most difficult questions, from amending the constitution in Ireland to addressing environmental disasters and population pressures in Uganda to changing the election system in parts of Canada.

But using this process to inform content policies for social media has never been done. So to explore how we could use it more effectively, including as a way of gathering feedback for building the metaverse, we piloted three Community Forums, two in the U.S. and one international, in partnership with the Behavioural Insights Team, a UK-based social insights organization. Around 260 users in five countries (the U.S., Nigeria, India, Brazil and France) deliberated on Meta’s approach to problematic climate content.

What we found from these pilots was both surprising and encouraging, especially given how varied people’s perspectives on climate change tend to be. For example, across all three forums there was widespread agreement that Meta should do more to educate users and encourage dialogue about important topics such as climate change. Participants were generally opposed to downranking certain posts or limiting people’s ability to share them, because they believed doing so would reduce discussion of the topic overall.

Forums like these have shown that people with opposing views can come to a meaningful consensus on difficult policy issues. We found consistent results across varied geographies. Facilitators witnessed high-quality discussions grounded in logical reasoning, with some participants even changing their minds on controversial subjects. And participants felt these forums were a step in the right direction for Meta. These results make us more likely to invest in efforts to inform people about how we approach these issues on our platforms.

Giving people a bigger voice in these decisions is a logical step in the evolution of how we govern our platforms. One example is the future metaverse. As it develops over the next few years, it will not be built or managed by one company, but by creators, developers, companies, and everyday users sharing responsibility for how it is governed.

Empowering people in new ways through a sense of presence and connection is what the metaverse will be all about. I’ve already seen the potential impact in my own family. I’m reminded of my late mother, who had physical disabilities toward the end of her life but who, thanks to the VR technology in our Quest headsets, would now be able to be present and participate in experiences her disabilities once prevented.

So as we build out a new platform centered around people, it makes even more sense that its rules be informed not just by companies and experts, but also by the very people who use these technologies. Incorporating Community Forums into how our platforms operate today can therefore serve as an important springboard for how they are governed tomorrow.

Brent Harris is vice president of governance at Meta.

