Why Critics Should Leave Section 230's Liability Protections Alone

By Jessica Melugin & John Berlau
December 22, 2020

Attacks on Section 230, the liability-limiting shield for online platforms, are coming from President Trump and from Republicans and Democrats on Capitol Hill. But the debate around curtailing or repealing the law is full of misunderstandings about how it functions and the consequences of changing it. To understand Section 230 better, it helps to think of its liability protections as encouraging speech online in much the same way that incorporation's liability protections encourage entrepreneurship.

The liability-limiting benefits of incorporation are widely understood and accepted. When entrepreneurs incorporate their businesses, they protect their personal property from potential litigation and risk. This doesn't mean there's no liability for the corporation, but it does spare owners significant and excessive litigation. This limiting of liability tips the scales in favor of continued entrepreneurship and increased commerce.

Section 230 is similar in that it encourages maximum speech online by ensuring that platforms won't be held liable for what others post on their sites. It acts as a filter that keeps platforms from being hauled into court every time someone objects to a post by a third party.

For example, let's say 'Every1saCritic' posts a negative review on Yelp complaining that the food at his local pizza parlor was served cold. But the owner of the pizza place begs to differ; he contacts Yelp and threatens to sue the platform if it doesn't take down that review. Obviously, Yelp does not want to spend the time or money to litigate the validity of the coldness claim, or of any claim like it among its 214 million online reviews as of the second quarter of 2020.

It's Section 230's liability protections that keep the floodgates closed on such cases, which could cripple bigger platforms and prove lethal for nascent up-and-comers.

It accomplishes this by placing the liability on the speaker of the post, not the host. Section 230 reads: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." That liability arrangement remains true even if the platform acts to curate third-party content by removing or flagging it. In fact, the law was created to incentivize online platforms to be proactive in setting their own standards for content and enforcing those standards, free from worry that moderating would trigger the liability obligations traditionally applied to publishers or distributors.

Section 230 was passed as part of the 1996 Communications Decency Act in response to a New York state court ruling from a year earlier, Stratton Oakmont v. Prodigy, which held that a service provider could be held liable for false information its users posted when that provider used content moderation tools. (As an aside, the victorious plaintiffs in the case, the principals of the Stratton Oakmont brokerage firm who sued the Prodigy platform for defamation over a third-party post accusing the firm of fraud, would indeed be convicted of securities fraud and later achieve infamy in the film "The Wolf of Wall Street.") Section 230 was passed so that information about issues of vital public interest, such as securities fraud, could flow freely without platforms fearing they would be held liable for the content of those third-party posts. In other words, user posts can safely stay up even if some consider the content untrue or offensive. This leads to more speech online, not less, contrary to what some Section 230 critics on the right assert.

Because Section 230 standardized the rules and prevented cases that would have otherwise set precedent under common law and First Amendment considerations, the case law on platform liability for third-party content is sparse. It's uncertain what liability for platforms would look like without Section 230's clarifications. But what is more certain is that the costs and risks of litigating over third-party content for which platforms might be found liable could be prohibitive for small startups and chill important speech.

The genius of Section 230 is that it settles liability questions before the costs of litigation are incurred, giving platform operators the same sort of assurance that state incorporation statutes give entrepreneurs by limiting personal liability for corporate actions. A repeal of Section 230 would likely create a harmful, overly litigious environment for online platforms, in the same way that removing the protections of incorporation would be damaging for entrepreneurs. Lawmakers on both sides of the aisle should think twice before repealing Section 230.

Jessica Melugin is Director of the Center for Technology and Innovation at the Competitive Enterprise Institute. John Berlau is a senior fellow at the Competitive Enterprise Institute and author of George Washington Entrepreneur: How Our Founding Father's Private Business Pursuits Changed America and the World.