Section 230
In the United States, Section 230 is a section of the Communications Act of 1934 that was enacted as part of the Communications Decency Act of 1996, which is Title V of the Telecommunications Act of 1996, and generally provides immunity for online computer services with respect to third-party content generated by their users. At its core, Section 230(c)(1) provides immunity from liability for providers and users of an "interactive computer service" who publish information provided by third-party users: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
Section 230 further provides "Good Samaritan" protection from civil liability for operators of interactive computer services in the voluntary good faith removal or moderation of third-party material the operator "considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected."
Section 230 was developed in response to a pair of lawsuits against online discussion platforms in the early 1990s that resulted in different interpretations of whether the service providers should be treated as publishers, Stratton Oakmont, Inc. v. Prodigy Services Co., or alternatively, as distributors of content created by their users, Cubby, Inc. v. CompuServe Inc. The section's authors, Representatives Christopher Cox and Ron Wyden, believed interactive computer services should be treated as distributors, not liable for the content they distributed, as a means to protect the growing Internet at the time.
Section 230 was enacted as section 509 of the Communications Decency Act of 1996. After passage of the Telecommunications Act, the CDA was challenged in the courts, and in Reno v. American Civil Liberties Union the Supreme Court ruled its anti-indecency provisions unconstitutional; Section 230 was determined to be severable from the rest of the legislation and remained in place. Since then, several legal challenges have upheld the constitutionality of Section 230.
Section 230 protections are not limitless: they do not shield providers from liability for material that violates federal criminal law, intellectual property law, or sex trafficking law. In 2018, Section 230 was amended by the Allow States and Victims to Fight Online Sex Trafficking Act, which removed its immunity for material violating federal and state sex trafficking laws. In the following years, Section 230's protections came under greater scrutiny over hate speech and alleged ideological bias, reflecting the power that technology companies can hold over political discussion, and the law became a major issue during the 2020 United States presidential election, especially with regard to alleged censorship of more conservative viewpoints on social media.
Passed when Internet use was just starting to expand in both breadth of services and range of consumers in the United States, Section 230 has frequently been referred to as a key law that allowed the Internet to develop.
Application and limits
Section 230 has two primary parts, both listed under §230(c) as the "Good Samaritan" portion of the law. Under §230(c)(1), as identified above, an information service provider shall not be treated as a "publisher or speaker" of information from another provider. §230(c)(2) provides immunity from civil liability for information service providers that remove or restrict content from their services they deem "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected", as long as they act "in good faith" in doing so.
In analyzing the availability of the immunity offered by Section 230, courts generally apply a three-prong test. A defendant must satisfy each of the three prongs to gain the benefit of the immunity:
- The defendant must be a "provider or user" of an "interactive computer service".
- The cause of action asserted by the plaintiff must treat the defendant as the "publisher or speaker" of the harmful information at issue.
- The information must be "provided by another information content provider", i.e., the defendant must not be the "information content provider" of the harmful information at issue.
As of mid-2016, courts had issued conflicting decisions regarding the scope of the intellectual property exclusion set forth in §230(e)(2). For example, in Perfect 10, Inc. v. CCBill, LLC, the 9th Circuit Court of Appeals ruled that the exception for intellectual property law applies only to federal intellectual property claims such as copyright infringement, trademark infringement, and patent infringement, reversing a district court ruling that the exception extends to state-law right of publicity claims. The 9th Circuit's decision in Perfect 10 conflicts with the analysis of other courts, including the court in Doe v. Friendfinder. The Friendfinder court specifically discussed and rejected the 9th Circuit's reading of "intellectual property law" in CCBill and held that the exception applies to both federal and state intellectual property claims, including state right of publicity claims.
Two laws passed since Section 230's enactment have added further limits to its protections. Under the Digital Millennium Copyright Act of 1998, service providers must comply with additional requirements regarding copyright infringement to maintain safe harbor protections from liability, as set out in the DMCA's Title II, the Online Copyright Infringement Liability Limitation Act. The Stop Enabling Sex Traffickers Act of 2018 eliminated the safe harbor for service providers with respect to federal and state sex trafficking laws.
Background and passage
Prior to the Internet, case law drew a clear liability line between publishers of content and distributors of content: a publisher was expected to be aware of the material it published and thus could be held liable for any illegal content it published, while a distributor would likely be unaware and thus would be immune. This was established in the 1959 case Smith v. California, where the Supreme Court ruled that putting liability on the provider (there, a bookseller) would have "a collateral effect of inhibiting the freedom of expression, by making the individual the more reluctant to exercise it."
In the early 1990s, the Internet became more widely adopted and created means for users to engage in forums and other user-generated content. While this helped to expand the use of the Internet, it also resulted in a number of legal cases putting service providers at fault for content generated by their users. This concern was raised in legal challenges against CompuServe and Prodigy, early service providers at the time. CompuServe stated it would not attempt to regulate what users posted on its services, while Prodigy employed a team of moderators to validate content. Both companies faced legal challenges related to content posted by their users. In Cubby, Inc. v. CompuServe Inc., CompuServe was found not to be at fault: because it allowed all content to go unmoderated, it was a distributor and thus not liable for libelous content posted by users. However, in Stratton Oakmont, Inc. v. Prodigy Services Co., the court concluded that because Prodigy had taken an editorial role with regard to customer content, it was a publisher and was legally responsible for libel committed by its customers.
Service providers made their Congresspersons aware of these cases, believing that, if followed by other courts across the nation, the decisions would stifle the growth of the Internet. United States Representative Christopher Cox had read an article about the two cases and felt the decisions were backwards. "It struck me that if that rule was going to take hold then the internet would become the Wild West and nobody would have any incentive to keep the internet civil," Cox stated.
At the time, Congress was preparing the Communications Decency Act, part of the omnibus Telecommunications Act of 1996, which was designed to make knowingly sending indecent or obscene material to minors a criminal offense. A version of the CDA, pushed by Senator J. James Exon, had already passed the Senate. A grassroots effort in the tech industry formed to convince the House of Representatives to challenge Exon's bill. Based on the Stratton Oakmont decision, Congress recognized that requiring service providers to block indecent content would cause them to be treated as publishers in the context of the First Amendment, and thus liable for other content, such as libel, not set out in the existing CDA. Cox and fellow Representative Ron Wyden wrote the House bill's section 509, titled the Internet Freedom and Family Empowerment Act, designed to override the decision in Stratton Oakmont so that a service provider could moderate content as necessary and would not have to act as a wholly neutral conduit. The new provision was added to the text of the proposed statute while the CDA was in conference within the House.
The overall Telecommunications Act, containing both Exon's CDA and the Cox-Wyden provision, passed both houses of Congress by near-unanimous votes and was signed into law by President Bill Clinton in February 1996. The Cox-Wyden provision, enacted as Section 509 of the Telecommunications Act of 1996, became law as a new Section 230 of the Communications Act of 1934. The anti-indecency portions of the CDA were challenged immediately upon passage, resulting in the 1997 Supreme Court case Reno v. American Civil Liberties Union, which ruled all of the anti-indecency sections of the CDA unconstitutional but left Section 230, among other provisions of the Act, in place as law.