The Supreme Court and the Future of Internet Law


Since Section 230 of the Communications Decency Act was enacted in 1996, the Supreme Court has never ruled on a case involving the statute, and only rarely has it ruled on a significant internet-related free speech case. In fact, the Justices themselves, with the exception of Justice Thomas, have been notably quiet regarding the scope of immunity the law provides to social media and tech companies, as well as the growing ability of these entities to influence public discourse throughout the nation. Thomas's disagreements with current interpretations of the statute were made apparent in a statement regarding the denial of certiorari in the Ninth Circuit case Malwarebytes, Inc. v. Enigma Software Grp. (2020), where he noted that "...we should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms" (id. at 2).

Despite contentious backlash from both sides of the political spectrum, the law has created a well-crafted legal regime that enables internet providers to innovate and expand operations without the fear of costly litigation (Chander 2014: 640-43). Yet much of the current landscape surrounding Section 230 may soon be altered, given the Court's recent grant of certiorari in two companion cases, Gonzalez v. Google, LLC and Taamneh v. Twitter, Inc., which are slated for oral arguments in February. The Court's decision to take up the cases has puzzled legal scholars, since the Justices rarely grant review of questions of statutory interpretation where, as here, there is scant lower court disagreement over the statute's application (Barthold, 2022). Nevertheless, the cases have the potential to transform the extent of the liability protections the law affords internet companies, as well as the content-recommendation systems and other tools those companies employ.

Section 230’s Applicability and the Facts of the Case

Regarded as one of the most important laws in tech policy, Section 230 was crafted to encourage the development of the internet amid a wave of emerging online services, while also creating a safe online environment for users (Neschke et al., 2022). With this understanding, lower courts have constructed a broad regime of liability protection for internet entities, consistently shielding them from lawsuits filed by aggrieved parties. Many cases have simply been dismissed, given that the statute's goal is to spare internet companies costly litigation and legal expenses (Brown, 2022). One area in which this has certainly been the case is the content moderation systems used by companies like Twitter and Facebook to filter and target content to users. Under current interpretations of the law, online services have been able to expand their capacity to regulate the content on their platforms, not only through human and automated moderators but also through algorithms that process user data to make content recommendations and detect online abuse (Neschke et al., 2022).

Many of Section 230's liability protections apply to online services that merely disseminate or modify content provided by users, as opposed to content developed by a service or website itself, which the law does not shield. Although this distinction can be hard to discern, and depends largely on the operation and functionality of the products internet companies distribute, courts will often examine whether a tool is used to neutrally facilitate discussion or to develop user-adapted content (Terr et al., 2022). In Gonzalez, the Court will determine whether terrorist content recommended to ISIS recruits by YouTube's algorithm falls into the latter category and outside the scope of Section 230's immunity protections (Neschke et al., 2022).

In both cases, the plaintiffs argue that Google and Twitter, respectively, violated the Anti-Terrorism Act by "aiding and abetting" the operations of terror groups like ISIS. In Gonzalez, the plaintiffs argue that Google, which owns YouTube, should be held liable for the death of an American woman killed in the 2015 Paris terror attacks. More specifically, they allege that Google communicated ISIS messaging and aided recruitment efforts through targeted content recommended to users on its site, and that it failed to remove these materials from its platform (Rutledge & Kaliuzhna, 2022). In Taamneh, the plaintiffs claim that Twitter should be held liable for failing to take greater steps to remove messages, videos, and other content circulated by ISIS, following the death of a Jordanian citizen in a 2017 attack in Istanbul, Turkey (ibid.). In response, Google and Twitter maintain that the suits should be dismissed under Section 230, since the law bars liability for third-party content published on their platforms and, as the Second and Ninth Circuits have held, also shields recommended content from such suits (Neschke et al., 2022).

Ninth Circuit Ruling and its Significance

In Gonzalez, a 2-1 panel sided with Google, finding that the company was protected under Section 230 from the claims brought by the plaintiffs. In particular, the court found that under its previous ruling in Fair Housing Council of San Fernando Valley v. Roommates.com (2008), any and all types of "neutral tools," such as content recommendation algorithms, were not to be categorized as content development, even in instances where they were used for unlawful purposes (id. at 1167-68), and that internet services do not become content providers by "augment[ing] the content generally" (id. at 1168). From this, the majority reasoned that Google could not be deemed an internet content provider, since "...the core principle [of Google's algorithms] stays the same: [they] select the particular content provided to a user based on that user's inputs" (id. at 895). Though declining to hold that "machine-learning algorithms can never produce content within the meaning of Section 230" (id. at 896), the majority nonetheless reiterated that Google's content-neutral algorithms did not amount to content development and thus remained within the scope of the law's protection.
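
To make the "neutral tools" concept concrete, consider the minimal sketch below. It is a hypothetical illustration in Python, not drawn from any company's actual system: a recommender whose ranking logic never inspects what a piece of content is about, but simply scores candidates by their similarity to the items a user has already viewed. This input-driven behavior is the content-agnostic quality the Ninth Circuit described.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recommend(watch_history, candidates, k=3):
    """Rank candidate items by similarity to a user's viewing profile.

    The same arithmetic runs for every user and every topic: the
    function never examines what an item depicts, only how closely its
    feature vector matches the average of the user's prior views.
    """
    # Build the user profile as the mean of the watched items' vectors.
    profile = [sum(col) / len(watch_history) for col in zip(*watch_history)]
    ranked = sorted(candidates.items(),
                    key=lambda item: cosine(profile, item[1]),
                    reverse=True)
    return [video_id for video_id, _ in ranked[:k]]

# Hypothetical feature vectors; a real system would use learned embeddings.
history = [[0.9, 0.1, 0.0], [0.8, 0.2, 0.1]]
candidates = {"vid_a": [0.85, 0.15, 0.05], "vid_b": [0.10, 0.90, 0.30]}
print(recommend(history, candidates, k=1))  # -> ['vid_a']
```

The point of the sketch is that the ranking is driven entirely by the user's own inputs; under the Ninth Circuit's reasoning, that neutrality is what keeps such a tool on the protected side of the line, regardless of whether those inputs happen to steer a user toward benign or harmful material.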

In Taamneh, the Ninth Circuit sided with the plaintiffs, permitting their lawsuit against Twitter to move forward on the grounds that the company "allow[ed] ISIS accounts and content to remain public even after receiving complaints about ISIS's use of their platforms" (id. at 910). Employing a three-factor test from Halberstam v. Welch (1983), the majority concluded that Twitter could be held liable for damages under the Anti-Terrorism Act: ISIS's activities on the platform were extensive, including raising funds, recruiting members, and disseminating propaganda; Twitter was instrumental to ISIS's ability to reach audiences worldwide; and Twitter had failed to detect ISIS-affiliated accounts since they first appeared on the platform in 2010 (id. at 909-11). However, because the district court did not consider Section 230's application to the case, the Ninth Circuit declined to address it, ruling only on the alleged violations of the Anti-Terrorism Act.

What Comes Next?

Though this will be the first time the Supreme Court weighs in on a Section 230 dispute, the lower courts have developed an extensive body of case law, some of it involving facts similar to those now before the Court. In 2019, for instance, the Second Circuit decided a case very similar to Gonzalez in Force v. Facebook, in which U.S. citizens sought to hold Facebook liable for Hamas attacks in Israel. In a 2-1 majority, the court found that Facebook's use of neutral algorithms to match and deliver third-party content to users did not make it an internet content provider (id. at 69). Similarly, in Dyroff v. Ultimate Software Group (2019), the Ninth Circuit ruled that a website's algorithm-driven features, including an open-ended text box and group recommendations based on users' posts, qualified as "neutral" tools, thereby exempting the service from liability after a user overdosed on fentanyl-laced heroin obtained through the site (id. at 1099-1100).

While the Justices may attempt to clarify which content recommendation tools qualify as neutral facilitation, some commentators, like Techdirt's Tim Cushing, argue that algorithms cannot function without user input, and that just as tech companies are granted immunity for videos posted on their sites, so too should algorithmic recommendations be shielded (2021). If the Court does narrow Section 230's immunity protections, internet companies will likely undertake far more extensive content moderation, a financially burdensome task for smaller firms (Brown, 2022), along with a dramatic restructuring of their operations and business models. The likely result under this new regime is a more enclosed internet, dominated by the largest tech firms.

References

Barthold, Corbin. "Section 230 Heads to the Supreme Court." Reason, November 4, 2022. https://reason.com/2022/11/04/section-230-heads-to-the-supreme-court/.

Brown, Elizabeth Nolan. “YouTube ISIS Videos Mean the Supreme Court Could Reconsider Section 230.” Reason Foundation, April 25, 2022. https://reason.com/2022/04/25/youtube-isis-videos-mean-the-supreme-court-could-reconsider-section-230/.

Chander, Anupam. "How Law Made Silicon Valley." Emory Law Journal 63 (2014): 640-43.

Cushing, Tim. “Ninth Circuit Appeals Court Says Some Disturbing Stuff about Section 230 While Dumping Another Two ‘Sue Twitter for Terrorism’ Lawsuits.” Techdirt, June 30, 2021. https://www.techdirt.com/2021/06/30/ninth-circuit-appeals-court-says-some-disturbing-stuff-about-section-230-while-dumping-another-two-sue-twitter-terrorism/.

Dyroff v. Ultimate Software Grp., Inc., 934 F.3d 1093, 1099-1100 (9th Cir. 2019)

Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1167-68 (9th Cir. 2008)

Force v. Facebook, Inc., 934 F.3d 53, 69 (2d Cir. 2019)

Gonzalez v. Google LLC, 2 F.4th 871, 895-96, 909-11 (9th Cir. 2021)

Halberstam v. Welch, 705 F.2d 472 (D.C. Cir. 1983)

Malwarebytes, Inc. v. Enigma Software Grp. USA, LLC, 141 S. Ct. 13, 208 L. Ed. 2d 197 (2020) (statement of Thomas, J., respecting the denial of certiorari)

Neschke, Sabine, Danielle Draper, Sean Long, Sameer Ali, and Tom Romanoff. "Gonzalez v. Google: Implications for the Internet's Future." Bipartisan Policy Center, November 29, 2022. https://bipartisanpolicy.org/blog/gonzalez-v-google/.

Rutledge, Peter, and Olha Kaliuzhna. "Social Media and the Anti-Terrorism Act Arrive at the Supreme Court." Law.com Daily Report, November 15, 2022. https://www.law.com/dailyreportonline/2022/11/15/social-media-and-the-anti-terrorism-act-arrive-at-the-supreme-court/.

Terr, Aaron, J. T. Morris, Jared Mikulski, and Jessie Appleby. "FIRE's Supreme Court Preview 2022-2023." The Foundation for Individual Rights and Expression, December 2, 2022. https://www.thefire.org/news/fires-supreme-court-preview-2022-2023.