By: Katie Stoughton
January 22, 2023
“Hope u die,” was one of the tamer death threats aimed at Anita Sarkeesian in the wake of #Gamergate, the targeted online harassment campaign against her and multiple other women in the gaming industry. This explicitly misogynistic and anti-feminist campaign of abuse was so severe that it led multiple women to withdraw from the gaming industry and caused the FBI to investigate threats lodged against them. Sarkeesian’s experiences with online abuse were extreme, but not unique. Almost anyone who spends time online has witnessed or been subject to harmful content. In the U.S., approximately 41% of people have experienced online harassment. Harmful content online has been linked to a myriad of offline harms, including, but not limited to, white nationalist terrorism; for example, fervor for the Capitol Insurrection was stirred up online long before the events of January 6th. Online abuse is so rampant that it is expected, but how did we get here? While some point to anonymity or mob culture online, perhaps the strongest contender is Section 230 of the Communications Decency Act, otherwise known as “the twenty-six words that created the internet.”
It is undeniable that Section 230 is one of the most important pieces of legislation in creating the modern internet. The internet, and particularly internet platforms, host a great amount of third-party content, and some of this content may be harmful and result in legal consequences. Section 230 arose from the strange Catch-22 situation that internet companies found themselves in regarding the hosting of third-party content after CompuServe and Prodigy.
While liability is clear for the first-person speakers of tortious content, the liability analysis is more complicated for the companies that host or publish third-party content. Traditionally, courts draw a line between distributors and publishers. Publishers can be held liable for third-party content they publish because they took part in the publishing and are expected to publish responsibly. Distributors are liable only if they knew or had reason to know that the content they were distributing was tortious. This precedent led directly to CompuServe and Prodigy. Both cases centered on defamatory third-party content posted by users on the companies’ websites. In Prodigy, Prodigy took steps to moderate content on its platform; the court therefore determined that Prodigy was acting as a publisher and could be held liable for the content published on its site. In CompuServe, CompuServe took no steps to moderate the content it hosted; the court therefore found it to be a distributor and not liable for that content.
Effectively, these decisions encouraged companies to ignore the proliferation of harmful content and penalized companies that tried to moderate it. To rectify this situation, Congress passed Section 230(c), titled “Protection for ‘Good Samaritan’ blocking and screening of offensive material,” which shielded internet companies from liability for third-party content.
This was a major victory for the internet. Many of the most important internet platforms thrive on third-party content, and the cost of litigation and moderation could have killed many internet companies in their infancy. We would not have the modern internet without Section 230. Defenders of Section 230 seem to contend that because of this fact, Section 230 should be untouchable. But just because something is important does not make it untouchable. Section 230(c) is the “Good Samaritan” clause. Yet, while it endows internet companies with all the benefits of “Good Samaritan” protections, it does not require internet companies to act as Good Samaritans. With all the benefits and none of the responsibilities of a Good Samaritan, companies can have their cake and eat it too. While some platforms try to moderate harmful content online, they lack true incentives to combat it vigorously. As our lives increasingly shift online, it is more timely than ever to amend Section 230. If companies want the benefits of being treated like a Good Samaritan, then we must make them act like one.
 See Eliana Dockterman, What Is #GamerGate and Why Are Women Being Threatened About Video Games?, TIME (Oct. 17, 2014), https://time.com/3510381/gamergate-faq/ (explaining the history of Gamergate).
 See id. (providing context for Gamergate and the damage it caused); see also Caitlin Dewey, The Only Guide to Gamergate You Will Ever Need to Read, Wash. Post (Oct. 14, 2014), https://www.washingtonpost.com/news/the-intersect/wp/2014/10/14/the-only-guide-to-gamergate-you-will-ever-need-to-read/ (giving examples of the type of abusive content online during Gamergate).
 Emily A. Vogels, The State of Online Harassment, Pew Rsch. Ctr. (Jan. 13, 2021), https://www.pewresearch.org/internet/2021/01/13/the-state-of-online-harassment/.
 See Daniel Karrell, Online Extremism and Offline Harm, items (June 1, 2021), https://items.ssrc.org/extremism-online/online-extremism-and-offline-harm/ (explaining that harmful online content is directly related to offline harms, like the January 6th Insurrection).
 See generally Jeff Kosseff, The Twenty-Six Words that Created the Internet 2, 4, 8 (2019) (explaining the importance of Section 230 in the development of the internet).
 See generally Cubby, Inc. v. CompuServe, Inc., 776 F. Supp. 135, 141 (S.D.N.Y. 1991) (holding that a company that does not moderate is a distributor); Stratton Oakmont Inc. v. Prodigy Servs. Co., No. 31063/94, 1995 WL 323710, at *4 (N.Y. Sup. Ct. May 24, 1995) (holding that a company that does moderate is a publisher).
 See Prodigy Servs. Co., No. 31063/94, 1995 WL 323710, at *3 (holding that a company that does moderate is a publisher).
 See Cubby, 776 F. Supp. at 141 (holding that a company that does not moderate is a distributor).
 47 U.S.C. § 230(c)(2)(A)-(B) (“No provider or user of an interactive computer service shall be held liable on account of— (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).”).
 See Susan Benkelmen, The Law that Made the Internet What It Is Today, Wash. Post (Apr. 26, 2019), https://www.washingtonpost.com/outlook/the-law-that-made-the-internet-what-it-is-today/2019/04/26/aa637f9c-57c5-11e9-9136-f8e636f1f6df_story.html (explaining the importance of Section 230 in the creation of the internet).
 See Jason Kelley, Section 230 is Good, Actually, Elec. Frontier Found. (Dec. 3, 2020), https://www.eff.org/deeplinks/2020/12/section-230-good-actually (defending Section 230 from change).