The Challenger

Web Hosts Need To Remain Exempt From Libel Laws But Only If They Manage Sites With Content Neutrality


LEGAL ANALYSIS

Columnist Michael Starvaggi, A Local Attorney, Ponders The Defamation Realm On Websites In Light Of Recent Rockland County Cases

By Michael Starvaggi

Recent articles in the Rockland County Business Journal have focused on defamation in the digital age. We have seen cautionary tales about the identity of a presumptively anonymous user being uncovered for purposes of a defamation lawsuit, as well as the diminishment of protection from online libel for those who are deemed to be public figures.

But what about the website or social media platform where the defamatory words are posted?  Does the host share responsibility, and face liability, for potentially defamatory statements posted by its users?

Prior to the passage of Section 230 of the Communications Decency Act in 1996, the courts followed the basic rule that both the author and the publisher of defamatory material are liable to the injured party. This meant that any Web or social media host that exercised editorial control over content submitted by its users was vulnerable to legal action, whereas hosts that took a completely hands-off approach to user content were exempt from liability.

This led to the seemingly inequitable result that hosts who put in a measure of effort to keep problematic content off their sites were, by that very action, putting themselves at risk of liability.

Section 230 was passed to remedy this inequity.  It states that an online content provider is not to be treated as the publisher of material provided by third parties on its site even if that provider exercises traditional editorial functions over user-submitted content such as removing or editing material.

As a result, online hosts are now exempt from liability as the publisher of defamatory content posted by their users.  Simple enough.

However, Section 230 goes on to say that online hosts are also exempt from civil liability based on “any action voluntarily taken in good faith to restrict access to or availability of material that the [host] considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected” [emphasis added]. So, in addition to being protected for passively allowing user content to remain on their sites, hosts are also immune from liability for actively removing any content they find objectionable under their own subjective standards.

This is the legal foundation of social media censorship — and it must be changed.

Online hosts must either sacrifice the ability to weed out content based on their own proclivities or surrender their protection from civil liability – but they cannot continue to have the power of legalized censorship.

There is a simple and logical approach to reforming Section 230 that deserves consideration. It follows the basic tenet of First Amendment law known as content neutrality. That concept allows reasonable restrictions on speech only when the restrictions depend not on the content of the speech, but solely on the time, place, and manner in which it is disseminated. So, a state may require that the sale or distribution of all merchandise, including printed material, at its state fair be conducted only from assigned booths (Heffron v. International Society for Krishna Consciousness, 452 U.S. 640 (1981)), but it cannot prohibit the display of signs critical of foreign governments outside embassies (Boos v. Barry, 485 U.S. 312 (1988)), because the former restriction does not discriminate based on the content of the message, while the latter does. In essence, the content-neutrality rule allows restrictions on speech only if there is no discrimination as to the point of view of the speaker.

This concept should be introduced into the language of Section 230 by a simple modification stating that any deletion or other restriction of material made based on the point of view of the speaker is deemed not to have been made in good faith and therefore exposes the host to liability.

It can be argued that a private company should be able to restrict any speech it wishes and promote speech as it sees fit.  That freedom can and should abide as to any content the host decides to publish on its own behalf.

However, the idea behind Section 230 is that a forum that allows its users, as outside parties, to publish their own thoughts should not be punished for what its users have to say. That protection is only justified when the forum is a truly open one that gives equal say to everyone regardless of their point of view. There is no valid rationale for protecting an online host with respect to user content that it manipulates and subjects to content-related censorship. Thus, Section 230 should only grant protection from liability to hosts that do not discriminate based on the viewpoint of their users.

Making that simple change to Section 230 would go a long way toward reinstating truly free discourse in this country.

Because the First and Fourteenth Amendments protect free speech only against infringement by government actors, the law that has developed regarding content neutrality does not currently apply to the private sector.

Michael Starvaggi is a Nyack-based attorney. mstarvaggi@starlawpc.com