Is Facebook Awash in Hate?

By Thomas Lenard
August 20, 2020

The authors of Facebook’s recent Civil Rights Audit seem to believe hate speech and voter suppression are major problems on the platform. That may be true, but neither the 89-page report nor the 2018 interim report even tries to measure the existence or magnitude of the problem. Instead, the report is mainly an audit of Facebook’s progress (or, more accurately, lack of progress, in the auditors’ view) in implementing policies promoted by the civil rights community. The issues raised by the civil rights groups identified in the report are serious, but the report tells us little about what is actually happening on the platform or about the effectiveness of measures Facebook has already implemented.

Of all the questions related to hate speech and voter suppression on the Facebook platform, the auditors chose to highlight Facebook’s approach to three of President Trump’s recent posts. Two of those posts concerned mail-in voting, which the auditors considered factually inaccurate and examples of voter suppression. The third, in connection with recent demonstrations, was cited as an example of hate speech that incites violence.

The auditors are very critical of Facebook’s policy of not fact-checking politicians (with some exceptions, such as Covid-related misinformation) and of Facebook CEO Mark Zuckerberg’s “prioritization of a definition of free expression as a governing principle of the platform.” They consider Facebook’s failure to remove the Trump posts “significant setbacks for civil rights.”

How platforms handle posts from politicians is important, but in focusing so much attention on the Trump posts, the Facebook Audit missed an opportunity to conduct a real audit of hate speech and voter suppression on the platform. Auditing (and discussing) these issues meaningfully requires defining the problems in a measurable way. The report does not define hate speech or voter suppression or contain any data on their prevalence. Without definitions, it is difficult to determine the magnitude of the problems and any progress in ameliorating those problems. Further, without definitions it is also difficult to apply consistent standards to politicians’ posts.

The lack of definitions in the report likely stems from the underlying problem that precisely defining hate speech and voter suppression is difficult. Extreme cases are easy to classify. The difficulty is where to draw the line. For example, what sorts of discussion of the pros and cons of different voting procedures, such as mail-in balloting, would be acceptable? Proposing definitions might generate the same controversies and criticisms for the authors that Facebook itself faces, but by suggesting some definitions the authors could have moved the debate forward significantly.

Tackling a problem without defining and measuring it is unlikely to address the real issues. Facebook must already have an internal definition of hate speech, since it claims that only a tiny fraction of the billions of posts on its platform every day is hateful and that it removes 89% of hate speech before anyone sees it, up from 65% in 2019 (one of the few statistics the audit cites).

Facebook has committed to working with others in the advertising industry to arrive at a common definition of hate speech. Facebook says it intends to use the common definition as the basis to measure the prevalence of hate speech in its regular transparency reports so that we can know how much of the universe of Facebook posts is accounted for by hate speech.

If Facebook follows through on the plan it has outlined, we will have better data with which to evaluate its claims about the prevalence of hate speech.

Facebook should similarly work, perhaps with outside groups, to define voter suppression. The audit suggests voter suppression is rampant but, again, does not define it. Without a definition, it is difficult to know how prevalent voter suppression truly is or how effective any measures to address it are.

None of this is to suggest that the audit lacks value. It has value, if only because it is pushing Facebook to gather, and publish, better data on what actually happens on its platform. That is no small accomplishment.

Thomas Lenard is a senior fellow and president emeritus at the Technology Policy Institute.