How Many 'Oversight Boards' Does It Take to Dim a Light Bulb?
Very soon, Facebook’s much-anticipated “Oversight Board” will be open for business. The 20-member board, funded by Facebook, but composed of “outside experts and civic leaders,” says it will “[e]nsur[e] respect for free expression, through independent judgment.”
The Board’s rulings will have the power to overturn Facebook and Instagram’s moderation decisions and will be binding as to the content at issue. Rulings must be implemented “unless doing so could violate the law.” As neither the Board’s site header nor its URL mentions Facebook or Instagram, one wonders whether it might someday oversee all content curation. But in the meantime, Facebook seems to hope that adding another layer of moderation, a board of authoritative supermoderators, will quell concerns about its biased content curation practices.
Enter the Board’s arch-rival, The Real Facebook Oversight Board, which includes Facebook’s former head of election integrity ops for political ads, along with leaders of #StopHateForProfit. Among the 25 authorities in this group are “experts from academia, civil rights, politics and journalism.” Why another board? Facebook apparently thinks only one is needed: it recently forced the ISP of realfacebookoversight.org to take down the group’s site for alleged phishing. (The URL now redirects here.) Facebook’s Board is permitted up to 90 days to make decisions. And Carole Cadwalladr, founder of the group behind the rival board, emphasized the need during the elections for “a real-time response from an authoritative group of experts to counter the spin Facebook is putting out.”
Surely two dueling boards will finally solve the problem of biased Facebook content moderation practices? Well, that depends. Some will see the impressive credentials of the boards’ members and rest easy. After all, if the goal is to make content curation informed by normative standards seem “fair,” and thereby preserve it, then authoritative oversight is arguably adequate.
But let’s dig deeper. Even granting the board members’ impressive credentials, few of us know any of them personally, and so we can’t know whether they are able, or willing, to set aside biases when evaluating content. Remember, the Community Standards they’ll be applying, including Facebook’s prohibition of “hate speech,” are laden with normative conclusions. What if you disagree with one of those conclusions, and that’s evident in the content you’ve posted? Perhaps the best you can hope for, if you appeal, is that some members of at least one of the boards share your views. But what if none do?
“[I] think that there are mysteries in the sky and under the water and in the plants which grow. But the Council of Scholars has said that there are no mysteries, and the Council of Scholars knows all things.”
This, from Ayn Rand’s dystopian novella, Anthem, is what awaits us if we continue down this path, trying to address the problems inherent in normatively guided content curation merely by adding another, more authoritative layer of review. How many degrees of separation are there between an “Oversight Board” and either Rand’s “Council of Scholars” or Orwell’s “Ministry of Truth”? Oversight is just another flavor of moderation. The real solution is to eliminate viewpoint-based content moderation and allow individuals to think for themselves.
Layers of oversight cannot replace independent thought and judgment. Facts about whether hydroxychloroquine is an effective treatment for COVID-19 don’t care whether Facebook, or its overseers, agree. Similarly for whether COVID lockdowns, or net-zero carbon emissions, are good policy. Or whether a trove of emails really is evidence of pay-to-play. Reaching the truth regarding these and other issues is vital for our survival and, as history tells us, truth is often arrived at only after extended, spirited debate among people of opposing viewpoints.
No amount of “oversight” can guarantee truth and, even if it could, no person or group of persons can impose understanding. What they can do, unfortunately, is hinder individual thought, by preventing people from acting on their own judgment, even when that action is mere expression. Why bother struggling to discover and communicate what is true, if no avenue remains for communicating it? Our founders understood this. The First Amendment to our Constitution was the result.
At Parler we don’t knowingly allow our platform to be used as a tool for crime, civil torts, or other unlawful acts. We believe the best answer to non-criminal hate speech (to take one example) is more speech. Our minimalist guidelines are enforced by a quorum-based community jury system. While no human being is omniscient or infallible, and so no system is free of errors, we believe our system affords the widest possible latitude for expression. For thought. For error correction in pursuit of truth and values.
When we see millions of light bulbs, we at Parler don’t devise new, more palatable ways to dim them. We look for ways to allow even more of them to shine more brightly.
Amy Peikoff is chief policy officer at Parler.