How Big Tech gets away with censorship: what is § 230 and why was it enacted?

This is the first of three posts examining Section 230 of the Communications Decency Act of 1996, commonly known as § 230. While the law was enacted with the best of intentions to allow the internet to flourish, it has been misconstrued by courts for decades and is being taken advantage of by big tech companies. This post examines the events that led to § 230 being enacted, the short legislative history, and what the law actually says. The next post will examine how courts have wrongfully interpreted § 230, straying far from the plain meaning of its unambiguous text. The final post will consider several ways to fix § 230 and which might be most advantageous from a constitutional and policy perspective.

While we are all familiar with the most egregious forms of big tech censorship, some smaller and more recent examples highlight the current absurdity. For the past several weeks, Facebook has blocked users from sharing a New York Post story detailing how the co-founder of Black Lives Matter spent millions of dollars to buy several homes. The Twitter account of the satirical website the Babylon Bee remains suspended for posting that Rachel Levine, the Assistant Secretary for Health at the U.S. Department of Health and Human Services, was its “man of the year” winner. And YouTube removed videos from the most recent Conservative Political Action Conference (CPAC) because speakers discussed irregularities in the 2020 election.

Over the past two decades, big tech companies like Facebook, YouTube, Twitter, and others have played a consequential role in the dissemination of news, ideas, and viewpoints across the world. While these companies can do enormous good in connecting people in ways never thought possible, they also have the potential to do enormous harm by silencing or removing speech they disagree with. To make matters worse, these companies rarely apply their policies in an objective and evenhanded way, leaving users at a loss as to what got them banned in the first place.

Summary

Online platforms (big tech companies that host third-party speech) faced judicial uncertainty about their liability for third-party speech before Congress codified 47 U.S.C. § 230. Because the internet was new, courts struggled to place online platforms into traditional categories of liability for operators of communications systems.

  • Under traditional principles of American law, the liability of operators of communications systems for third-party speech fell into one of three categories: publisher, distributor, and platform.
  • Publishers (newspapers) are strictly liable (liable in the same way the speaker would be). Distributors (bookstores) are liable only if they knew or had reason to know of the problematic content of the speech. Platforms (telephone companies) are not liable.
  • Two judicial decisions informed Congress in codifying § 230. In Cubby, the court held that CompuServe could be held liable only as a distributor for third-party speech on its online bulletin board because it exercised no editorial control over what users posted. But in Stratton Oakmont, a different court held that Prodigy could be held liable as a publisher for third-party speech on its online bulletin board because it exercised editorial control in several ways, such as employing a software program that pre-screened posts before they appeared.
  • As a result of Cubby and Stratton Oakmont (sometimes referred to as Prodigy), online platforms faced an untenable choice: keep a close watch over what users posted, remove content that violated the terms of service, and be liable as a publisher; or keep an arm’s-length distance, allow anything to be posted, and be liable only as a distributor.
  • The short legislative history of § 230 confirms that the lead sponsor believed Stratton Oakmont (sometimes referred to as Prodigy) was wrongly decided, that it created bad incentives for online platforms that wanted to pre-screen content, and that these platforms needed the ability to remove disgusting or nasty material from their websites without being held liable as publishers. Members of Congress were also concerned that filthy material on the internet made it dangerous for kids.
  • A plain reading of the text of § 230 shows that Congress insulated online platforms only from publisher or speaker liability (i.e., abrogated Stratton Oakmont’s theory of liability) and gave these platforms the ability to remove filthy material from their websites. However, Congress kept Cubby’s distributor theory of liability in place and did not insulate platforms from being held liable for their own speech or product.

Importantly, these companies are not government actors and therefore are not subject to the First Amendment’s Free Speech Clause. In other words, there is no constitutional right to speak on these platforms. Before Congress enacted 47 U.S.C. § 230, these companies were rightly concerned about certain forms of liability arising from third-party speech on their platforms. For example, companies like America Online (AOL) that hosted bulletin boards and allowed users to post were concerned that a user might defame someone and that the company might be held liable for that defamation as a publisher (i.e., the person who was defamed could sue AOL, as well as the user, for publishing the defamatory post).

Congress codified § 230 to protect AOL from publisher liability in the hypothetical above. The judicial decisions that Congress responded to, the legislative history of the law, and the plain text of the law all make this clear. Congress also wanted AOL to be free to remove filthy or nasty content without being held liable as a publisher.

Different kinds of liability

In the United States, the law has traditionally divided operators of communications systems into one of three buckets. Each comes with its own liability rules for third-party speech.

  • Publisher. Historically, those who published or repeated the material of third parties were liable in the same way they were liable for their own speech (strict liability). See Restatement (Second) of Torts § 578 (1977). This applied to newspapers, magazines, and broadcasters. For example, if a newspaper printed a defamatory letter to the editor, it would be subject to the same standard of liability as the person who wrote the letter. This harsh standard existed because these entities exercised complete editorial control over their content: if they believed something was defamatory, they could choose not to publish it.
  • Two Supreme Court cases severely limited defamation law in ways that, in turn, protect publishers from the most serious defamation claims. In New York Times v. Sullivan (1964), the Court held that a public official bringing a defamation claim must show the defendant acted with actual malice, meaning the statement was made with knowledge of its falsity or with reckless disregard for its truth. In Curtis Publishing Company v. Butts (1967), the Court extended the actual-malice standard to defamation claims brought by public figures. Because it is very hard, if not impossible, to prove actual malice, publishers are almost completely shielded from defamation claims involving a public official or public figure.
  • Distributor. Those who distributed material of third parties were liable only if they knew or had reason to know of the problematic content of the speech. See Restatement (Second) of Torts § 581 (1977). This is often described as a notice-and-takedown standard. Bookstores, libraries, and newsstands that distribute material published by third parties are subject to this standard of liability. Like publishers, distributors have full control over what they decide to distribute; but unlike publishers, distributors are not distributing their own product and are subject to a different standard of liability as a result.
  • Platform. Platforms or conduits are not liable for third-party speech. Typical examples include phone companies, broadcast stations legally required to host candidate ads, and cities or towns that host public demonstrations. See Restatement (Second) of Torts § 612 (1977). The common theme among this group is that they merely host third-party speech. Because they serve as hosts and are subject to nondiscrimination rules, they cannot be held liable for the speech of others. They exercise no editorial control over third-party speech, and it would be impossible for them to do so. Moreover, the product they are hosting is not their own.
  • In Lunney v. Prodigy Servs. Co. (N.Y. 1999), the New York Court of Appeals (the state’s highest court) extended platform immunity to Prodigy’s email system on the theory that its “role in transmitting e-mail is akin to that of a telephone company, which one neither wants nor expects to superintend the content of its subscribers’ conversations.”

Judicial decisions that informed Congress in codifying § 230

The three buckets of liability for third-party speech make sense as applied to preexisting forums, but how would courts apply them in the early years of the internet? Two key decisions directly informed Congress in drafting and passing § 230.

Cubby Inc. v. CompuServe Inc. (S.D.N.Y. 1991)

CompuServe’s online service allowed subscribers to access hundreds of special-interest forums comprising interactive bulletin boards organized by topic. One of these was the Journalism Forum, managed by Cameron Communications, Inc. (CCI), a company independent of CompuServe that contracted to manage the forum’s contents in accordance with standards established by CompuServe. One newsletter on the Journalism Forum was Rumorville USA, a daily newsletter that provided reports about broadcast journalism and journalists. While Rumorville had no relationship with CompuServe, it did have a contract with CCI to provide content to the forum, and CCI agreed to accept full responsibility for Rumorville’s content. Importantly, CompuServe had no opportunity to review Rumorville’s posts before they were uploaded onto the Journalism Forum.

Another company, Cubby, developed Skuttlebut, a product intended to compete with Rumorville. Cubby alleged that Rumorville published defamatory statements about Skuttlebut and that CompuServe carried these statements on its Journalism Forum. Cubby sued CompuServe for defamation, claiming it published the defamatory statements.

Cubby argued that CompuServe was a publisher of the defamatory statements because the Journalism Forum was CompuServe’s product. In response, CompuServe argued that it could be held liable only as a distributor because it had no editorial control over what Rumorville put on the forum, and because CCI, not CompuServe, managed the forum. As stated above, the distinction between publisher and distributor is crucial. If CompuServe was a publisher, it would be liable in the same way Rumorville would be; but if CompuServe was a distributor, it would be liable for defamation only if it knew or had reason to know of the speech’s defamatory content.

The court held that CompuServe could be held liable only as a distributor. It compared CompuServe’s “computerized database” to the “functional equivalent of a more traditional news vendor” and reasoned that CompuServe had no more editorial discretion over Rumorville’s publication than other distributors have over their products. In other words, CompuServe was no different with respect to its forums than a newsstand or library is with respect to the newspapers or books it distributes. Moreover, like traditional distributors, CompuServe could not possibly examine every statement on every forum before it was posted. The court also underscored that CCI, not CompuServe, actually managed the forum. Because CompuServe lacked editorial discretion over the forum’s contents, it was held to the distributor standard of liability; and because it neither knew nor had reason to know of the allegedly defamatory statements, the court granted summary judgment in its favor.

Stratton Oakmont, Inc. v. Prodigy Services Company (N.Y. Sup. Ct. 1995)

Prodigy, an online service with at least two million subscribers, hosted a very popular bulletin board called “Money Talk.” At the time, it was among the most widely read financial computer bulletin boards in the United States, if not the most read. Members could post statements on the board, and Prodigy contracted with bulletin board leaders, a group that participated in board discussions, to promote the board and increase users. An anonymous user posted statements claiming, among other things, that Stratton Oakmont and its president, Daniel Porush, had committed criminal and fraudulent acts in connection with a company’s initial public offering.

Stratton Oakmont and Porush sued Prodigy for defamation. Stratton’s main argument was that Prodigy held itself out as a family-oriented computer network (i.e., it pre-screened posts to ensure that its bulletin boards were a family-friendly environment). Moreover, several newspaper articles quoted Prodigy’s Director of Market Programs and Communications saying that Prodigy “held itself out as an online service that exercised editorial control over the content of messages posted on its computer bulletin boards…” In addition, Prodigy used a software program to pre-screen posts for offensive language.

In response, Prodigy argued that its policies had changed in the 18 months since those statements were made, and that the new policies governed when the allegedly defamatory statement was posted. Prodigy likened itself to CompuServe in the Cubby case: roughly 60,000 messages were posted daily on its bulletin boards, making it impossible to review each one. And unlike CompuServe, which contracted with a third party to run the forum entirely, Prodigy’s bulletin board leaders did not serve as editors, though they could remove messages that violated Prodigy’s guidelines.

The court held that Prodigy could be held liable as a publisher. It first distinguished Prodigy’s product from CompuServe’s in Cubby. Unlike CompuServe, Prodigy held itself out to the public as having full control over its bulletin boards and used a software program to screen each post before it appeared. Prodigy’s use of this software showed that it exercised editorial control over the bulletin board because it frequently removed posts that did not accord with its guidelines; that not every offending post was removed the moment it was flagged did not matter. In addition to the software, Prodigy used its bulletin board leaders to enforce its terms of service on the board. In closing, while the court agreed that most computer bulletin boards should be seen as distributors, Prodigy’s conscious choice to exercise constant editorial control through its software and bulletin board leaders opened it up to publisher liability.

After Cubby and Stratton Oakmont, companies interested in creating online platforms faced an untenable choice. If they allowed third-party speech on their platforms, they could be treated as distributors, but only if they did not exercise editorial control over what was posted; that meant having no say over what went on their websites, allowing nasty or filthy material to fill up their bulletin boards. In contrast, if these companies wanted to screen posts to keep their websites clean, they would be subject to the much harsher publisher form of liability because they would be exercising editorial control over content. To make matters worse, there was a fair amount of gray area between these two cases, and a company setting up a website to host third-party speech could easily fall into a category of liability it did not want to be in.

Congressional debate over § 230

When interpreting an unambiguous statute, courts should never look to legislative history to discern the plain meaning (ordinary reading) of the text. As Justice Antonin Scalia correctly put it, “[t]he greatest defect of legislative history is its illegitimacy. We are governed by laws, not by the intentions of legislators… If one were to search for an interpretive technique that, on the whole, was more likely to confuse than to clarify, one could hardly find a more promising candidate than legislative history.” See Conroy v. Aniskoff (1993) (Scalia, J., concurring in the judgment).

While the text of § 230 is unambiguous, the short legislative history actually strengthens the case for reading the statute according to its plain meaning. Congress had one thing in mind in codifying § 230: overturning Stratton Oakmont so that internet companies could block material that would not be seen as “family friendly” without being held liable as publishers. Members of Congress were also concerned with protecting children from pornography on the internet.

In February 1996, Congress passed the Telecommunications Act of 1996. Included in that legislation was an amendment put forward by Representatives Chris Cox (R-CA) and Ron Wyden (D-OR) that became Section 230 of the Communications Decency Act of 1996, known today simply as § 230. Rep. Cox explained that one of the purposes of the amendment was to protect children from material they might see on the internet: “[a]s the parent of two, I want to make sure that my children have access to this future and that I do not have to worry about what they might be running into online. I would like to keep that out of my house and off of my computer.” He also talked generally about Cubby and Stratton Oakmont, calling Stratton Oakmont’s liability reasoning “backward” and explaining that “[w]e want to encourage people like Prodigy…to do everything possible for us, the customer, to help us control, at the portals of our computer, at the front door of our house, what comes in and what our children see. This technology is very quickly becoming available, and in fact every one of us will be able to tailor what we see to our own tastes. We can go much further…than blocking obscenity or indecency, whatever that means in its loose interpretations. We can keep away from our children things not only prohibited by law, but prohibited by parents.”

When explaining the purpose of the amendment, Rep. Cox said that “our amendment will do two basic things: First, it will protect computer Good Samaritans, online service providers, anyone who provides a front end to the Internet, let us say, who takes steps to screen indecency and offensive material for their customers. It will protect them from taking on liability such as occurred in the Prodigy case in New York that they should not face for helping us and for helping us solve this problem. Second, it will establish as the policy of the United States that we do not wish to have content regulation by the Federal Government of what is on the Internet, that we do not wish to have a Federal Computer Commission with an army of bureaucrats regulating the Internet…In this fashion we can encourage what is right now the most energetic technological revolution that any of us has ever witnessed…We can make sure that it operates more quickly to solve our problem of keeping pornography away from our kids, keeping offensive material away from our kids, and I am very excited about it.”

Rep. Wyden, a co-sponsor of the amendment, spoke next in support. Like Rep. Cox, he focused on ensuring that the internet was a safe place for kids and on keeping the Federal Communications Commission from becoming the entity that regulates it. “We are all against smut and pornography, and, as the parents of two small computer-literate children, my wife and I have seen our kids find their way into these chat rooms that make their middle-aged parents cringe. So let us all stipulate right at the outset the importance of protecting our kids and going to the issue of the best way to do it…Parents can get relief now from the smut on the Internet by making a quick trip to the neighborhood computer store where they can purchase reasonably priced software that blocks out the pornography on the Internet… They (the Senate) seek there to try to put in place the Government rather than the private sector about this task of trying to define indecent communications and protecting our kids. In my view…the approach of the other body, will essentially involve the Federal Government spending vast sums of money trying to define elusive terms that are going to lead to a flood of legal challenges while our kids are unprotected.”

The next several representatives spoke in support of the amendment and likewise focused on protecting children from nasty material on the internet. Finally, Rep. Bob Goodlatte (R-VA) spoke in support of the bill, referencing the Stratton Oakmont decision together with his belief that these companies cannot possibly review every post that comes onto their platforms before it is posted. “There is no way that any of those entities, like Prodigy, can take the responsibility to edit out information that is going to be coming in to them from all manner of sources onto their bulletin board. We are talking about something that is far larger than our daily newspaper. We are talking about something that is going to be thousands of pages of information every day, and to have that imposition imposed on them is wrong. This will cure that problem, and I urge the Members to support the amendment.”

While the legislative history of § 230 is not long, what the amendment’s sponsors said is very important. Rep. Cox focused on allowing internet platforms to police their websites to protect children without fear of being labeled publishers. Moreover, he specifically referenced the Stratton Oakmont decision, which he called “backward,” and said the amendment would ensure that companies like Prodigy would not face publisher liability when exercising editorial control over what is posted on their websites. Other members likewise focused on the importance of allowing internet companies to control what is put on their websites.

The Communications Decency Act of 1996 had several other provisions designed to protect children from being sent pornography over the internet. One provision prohibited the “knowing” transmission of “obscene or indecent” messages to any recipient under 18 years of age. Another prohibited the “knowing” sending or displaying to a person under 18 of any message “that, in context, depicts or describes, in terms patently offensive as measured by contemporary community standards, sexual or excretory activities or organs.” The Supreme Court would later hold both of these provisions unconstitutional under the Free Speech Clause of the First Amendment in Reno v. ACLU (1997).

Relevant text of 47 U.S.C. § 230

(c) Protection for “Good Samaritan” blocking and screening of offensive material

(1) Treatment of publisher or speaker

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

(2) Civil liability

No provider or user of an interactive computer service shall be held liable on account of – (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected…

(f) Definitions

As used in this section:

(2) Interactive computer service (online platforms) – [t]he term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.

(3) Information content provider (users that post on online platforms) – [t]he term “information content provider” means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.

Analysis of the text of § 230

Under a straightforward reading of the text, the reader can discern the following:

  • Under (c)(1), Congress abrogated the Stratton Oakmont publisher theory of liability. No online platform can be held liable for third-party speech under any theory of speaker or publisher liability. But these companies can still be held liable for their own speech or product, because their immunity protections apply only to content from “another information content provider.”
  • Whether online platforms strictly regulate third-party speech or do not regulate it at all, they do not lose their speaker or publisher immunity.
  • While (c)(1) abrogated Stratton Oakmont’s publisher theory of liability, it did not disturb the distributor theory of liability set forth in Cubby. The legislative history further strengthens this argument because no member of Congress questioned Cubby’s holding.
  • Unlike (c)(2), which broadly provides that “[n]o provider or user of an interactive computer service shall be held liable…”, (c)(1) insulates online platforms from lawsuits only where they would be held liable as a speaker or publisher. It does not insulate them from any other type of lawsuit; Congress would not have used such specific language in (c)(1) otherwise.
  • Under (c)(2), online platforms cannot be held liable for removing nasty or filthy material from their platforms so long as the action they take is voluntary, in good faith, and restricts access to the material. This is directly in line with what members of Congress wanted in making sure the internet was a safe environment for kids.
  • While (c)(2) provides several reasons why online platforms can restrict access to third-party material (in line with the bullet point above), “otherwise objectionable” must be read in the context of the preceding adjectives. It cannot be read as a catch-all for whatever the platform wants to remove. The legislative history and the statutory canon of ejusdem generis make this clear. Under the ejusdem generis canon, where general words follow specific words in a statutory enumeration, the general words must be construed to embrace only objects similar in nature to those enumerated by the preceding specific words. For example, in a statute listing “cars, trucks, motorcycles, or other vehicles,” the general phrase “other vehicles” would cover mopeds but not airplanes.
  • If an online platform removes speech for reasons other than those (c)(2) provides, there is no cause of action (lawsuit) that a user can bring for monetary damages or to have their account or post restored.