To Avoid Regulatory Snags, Facebook Needs to Empower Its Own Oversight Board


Last week, social media executives appeared before Congress, answering for their role in the January riot at the U.S. Capitol and, more broadly, for the spread of extremism and disinformation on their platforms.

Both issues have become increasingly difficult to ignore. On top of the January 6th insurrection, the rise of malicious conspiracy theories and rampant misinformation surrounding COVID-19 has provided the impetus for Facebook to change. In a press release, the House Committee on Energy and Commerce stated that “Industry self-regulation has failed.” Yet despite growing regulatory scrutiny around the world, heavy-handed government regulation doesn’t have to be the answer.

A bipartisan appetite for holding tech companies like Facebook accountable for extremist content published on their platforms has taken hold. Accountability is no longer optional. Facebook’s Oversight Board could offer a partial solution to a complex problem, but only if the social media giant stops clipping its wings.

The body is tasked with making content moderation decisions for Facebook’s 2.8 billion users by promoting free expression and making “principled, independent decisions.” The decisions made by the diverse group of 19 civic leaders and experts (including journalists, professors, and a Nobel Peace Prize laureate) are binding on Facebook, but for now, that is the full extent of the Board’s power. Limiting the Oversight Board’s scope leaves Facebook vulnerable to more severe regulation in the future.

In its current state, the impact the board has on you as a Facebook user is minimal. So far, five case decisions have been made. But what do they actually mean?

When appeals opened in October, the Oversight Board received 20,000 of them. Of course, it is unrealistic to expect every single one to be heard. Facebook might even be forgiven for the small number of cases the Board handles if those cases set precedents that affected other posts. Facebook claimed that cases with "the potential to affect lots of users around the world" would be prioritised. However, given the Board’s current remit, that claim is misleading. A decision about a single post can only be applied to other posts with identical content in parallel context, and Facebook has no obligation to take down even those similar posts, so the “binding” nature of the Board’s decisions doesn’t mean much. It’s easy to see why so many have been quick to judge the Oversight Board as a sham.

Nor can the Board decide to remove controversial posts that Facebook has chosen to keep on the site. The body that set out to tackle extremism on the platform can only fight half of the problem. Even board members have found themselves frustrated by the seemingly binary nature of their role. The Board must be able to look at both sides of the coin if it is to avoid governments stepping in to take its place.

While the yes-or-no decisions made by the Board don’t matter much, the policy recommendations that come with them should. But Facebook is under no obligation to enforce them. From the first five case decisions, Facebook committed to only 11 actions, and many of those were likely already in the pipeline or too vague to measure. Facebook has proven that the only real power it has given the Board is binary decision-making over a finite number of posts. That’s not good enough for Facebook’s users, nor for the governments eager to exercise greater control over the social media site. Committing more strongly to the Board’s policy recommendations would give it greater legitimacy in the eyes of users.

The Board isn’t past saving. It offers an institutionally innovative approach to a problem as old as social media itself.

An expansion of the Board’s regulatory powers shouldn’t be a cause for concern either. Board members have proven vigilant on free speech issues: in the five cases concerning a user’s removed post, including posts involving Nazi propaganda and nudity, the Board overturned Facebook’s decisions. It is disappointing that those rulings have failed to translate into outcomes that positively impact the wider Facebook and Instagram communities.

Facebook hasn’t done enough to show that it is taking reform seriously, and until it does, the incessant probing from Congress won’t cease. Expanding the mandate of the Board would allow Facebook to continue its self-governance experiment while implementing meaningful changes to the social media site without damaging the ability of users to express themselves. That’d be good news for us all.

Alexandra Harrison is a Young Voices UK contributor.
