Across the ideological spectrum, justices said they were confused by the arguments offered by the family’s lawyer and worried about making it easier for people to sue companies for the ways their algorithms sort and recommend material. They also expressed concern that the court could undermine an effort to provide immunity for the platforms that Congress passed decades ago, when lawmakers wanted to encourage the development of the internet.
Any decision by the court to keep broad immunity in place would be a major victory for tech companies, which say the protections are vital for allowing them to post content from outside parties. Critics say lower courts have afforded the industry more protection than Congress intended, absolving tech companies of responsibility for the hate speech and falsehoods that frequently litter their sites.
Supreme Court Justice Elena Kagan said one could question why Congress provided such immunity when passing Section 230 of the Communications Decency Act of 1996. But she drew laughter when she wondered how far the Supreme Court should go in cutting back such protection.
“We’re a court. We really don’t know about these things. You know, these are not like the nine greatest experts on the internet,” Kagan said.
The justices did not seem to think Eric Schnapper, the lawyer representing the family of Nohemi Gonzalez, had offered a coherent test for deciding when tech companies, otherwise immunized for third-party content on their sites, could be held liable for sorting and recommending that content.
Kagan and Justice Brett M. Kavanaugh suggested a ruling on behalf of the Gonzalez family could unleash a wave of lawsuits. Kavanaugh did not seem persuaded when Deputy Solicitor General Malcolm L. Stewart, representing the Justice Department and siding in part with the plaintiffs, said few lawsuits “would have much likelihood of prevailing.”
Kavanaugh said Congress knows that lower courts have interpreted the protections broadly. “Isn’t it better … to put the burden on Congress to change that, and they can consider the implications and make these predictive judgments?” he asked Stewart.
Stewart’s position was that, while Section 230 protects YouTube for allowing ISIS-affiliated content on the site, recommending content through the use of algorithms and other features requires a different analysis, without blanket immunity.
Courts in the past have found the Section 230 law shields tech companies from culpability over the posts, photos and videos that people share on their services. Google argues that the law protects it from legal responsibility for the videos surfaced by its recommendation algorithms, and that such immunity is essential to tech companies’ ability to provide useful and safe content to their users.
“Helping users find the proverbial needle in the haystack is an existential necessity on the internet,” said Washington lawyer Lisa S. Blatt, who represented Google, which owns YouTube. “Search engines thus tailor what users see based on what’s known about users. So does Amazon, Tripadvisor, Wikipedia, Yelp!, Zillow, and countless video, music, news, job-finding, social media, and dating websites.”
The Gonzalez family’s lawyers say that interpretation of Section 230 incentivizes promoting harmful content and denies victims an opportunity to seek redress when they can show those recommendations caused injuries or even death.
Justice Clarence Thomas, who has been a critic of Big Tech companies and the protections they receive, said Tuesday that he was unsure how YouTube could be said to be aiding and abetting terrorism when its “neutral” algorithms worked the same way whether a viewer was seeking information on the Islamic State or how to make rice pilaf.
Chief Justice John G. Roberts Jr. wondered whether recommending a similar video to someone who has expressed interest in a subject is not the 21st-century equivalent of a bookseller pointing a customer asking about sports-related books to the section of the store where they are kept.
Justice Sonia Sotomayor and Kagan told Schnapper that his argument about algorithmic recommendations was very broad. Because algorithms are used to respond to virtually every search, Kagan said, Schnapper’s position might mean Section 230 really provides no protection at all.
Schnapper agreed algorithms are “ubiquitous” but noted the ones at issue involved YouTube recommending Islamic State videos.
Justice Ketanji Brown Jackson aggressively questioned lawyer Blatt, suggesting the original intent of Section 230 was to protect tech companies from liability but also to encourage them to take down offensive content.
But Blatt refused to make concessions. She held fast to her argument that Section 230 is broad, strong and crystal-clear: platforms are not liable when dealing with any kind of third-party content, regardless of how they do or don’t promote it to their users.
Some justices indicated that was extreme — Justice Amy Coney Barrett asked if the companies would be protected if their sorting mechanism were not neutral but “really defamatory or pro-ISIS?” Section 230 would still protect it, Blatt said.
Congress wrote Section 230 after a court found Prodigy Services liable for defamatory comments on its site. At the time, message boards reigned supreme and Americans were newly joining services such as CompuServe, Prodigy and AOL, allowing their unvetted posts to reach millions. The statute’s key provision says no tech platform “shall be treated as the publisher or speaker of any information provided by another information content provider.”
Google successfully quashed the Gonzalez family’s lawsuit in lower courts, arguing that Section 230 protects the company when it surfaces a video in the “Up Next” queue on YouTube, or when it ranks one link above another in search results. But these and other wins came over the objections of some prominent judges who say lower courts have read the provision too broadly.
The case comes amid growing concern that the laws that govern the internet — many forged years before the invention of social media platforms like Facebook, YouTube, Twitter or TikTok — are ill-equipped to oversee the modern web. Politicians from both parties are clamoring to introduce new digital rules after the U.S. government has taken a largely laissez-faire approach to tech regulation over the last three decades. But efforts to craft new laws have stalled in Congress, pushing courts and state legislatures to take up the mantle.
Now, the Supreme Court is slated to play an increasingly central role. The justices on Wednesday will take up Twitter v. Taamneh, another case brought by the family of a terror-attack victim that alleges social media companies are responsible for allowing the Islamic State to use their platforms.
Barrett said the outcome of that case might be relevant to the Google lawsuit, and could even dictate whether the court has to settle the issues argued Tuesday.
In the term beginning in October, the court is likely to consider challenges to a law in Florida that would bar social media companies from suspending politicians, and a similar law in Texas that blocks companies from removing content based on a user’s political ideology.
U.S. Naval Academy law professor Jeff Kosseff, an expert on Section 230, said several of the justices appeared inclined on Tuesday to limit the protections the law provides, but did not yet show signs of consensus on what a new legal standard could look like.
“They really seemed to not … have a good idea of where they want to draw that line, because they recognize how difficult it is,” Kosseff said.
Kavanaugh, for example, worried that a bad decision could create “a lot of economic dislocation, would really crash the digital economy with all sorts of effects on workers and consumers, retirement plans and what have you.”
Mary Anne Franks, a University of Miami law professor who has proposed reforms to Section 230 to incentivize online content moderation, said some of the court’s questions suggested justices may be open to a more nuanced interpretation of the law than lower courts have so far embraced.
Section 230 was “intended to be a good Samaritan statute first and foremost,” Franks said, allowing online platforms to moderate content without fear of increasing their risk of liability. That’s in contrast to the expansive view many lower courts have taken, in which Section 230 is seen as giving platforms near-blanket immunity from any lawsuit arising from use of third-party content.
Franks said she thought that Jackson, in particular, “really brought that point home” with her questioning of Blatt as to just how far Section 230 immunity should stretch.
The case is Gonzalez v. Google.
Gerrit De Vynck contributed to this report.