
Supreme Court to hear Google case that could transform the internet


In November 2015, three rifle-wielding ISIS gunmen opened fire at a restaurant in Paris, killing 23-year-old Nohemi Gonzalez, a college exchange student. More than seven years later, her family is seeking justice for her death, targeting not the gunmen but the tech giant YouTube, in a landmark case that could shift the foundations of internet law.

The Supreme Court on Tuesday will hear oral arguments in Gonzalez v. Google, a lawsuit that argues tech companies should be legally liable for harmful content that their algorithms promote. The Gonzalez family contends that by recommending ISIS-related content, Google’s YouTube acted as a recruiting platform for the group in violation of U.S. laws against aiding and abetting terrorists.

At stake is Section 230, a provision written in 1996, years before the founding of Google and most modern tech giants, but one that courts have found shields them from culpability over the posts, photos and videos that people share on their services.


Google argues that Section 230 protects it from legal responsibility for the videos that its recommendation algorithms surface, and that such immunity is essential to tech companies’ ability to provide useful and safe content to their users.

The Gonzalez family’s lawyers say that applying Section 230 to algorithmic recommendations incentivizes promoting harmful content, and that it denies victims an opportunity to seek redress when they can show those recommendations caused injuries or even death.


The resulting battle has emerged as a political lightning rod because of its potential implications for the future of online speech. Recommendation algorithms underlie almost every interaction people have online, from innocuous song suggestions on Spotify to more nefarious prompts to join groups about conspiracy theories on Facebook.

Section 230 is “a shield that nobody was able to break,” Nitsana Darshan-Leitner, the president and founder of Shurat HaDin, an Israeli law center that specializes in suing companies that aid terrorists, and one of the lawyers representing the Gonzalez family, said in an interview. “It gave the social media companies the belief that they’re untouchable.”

YouTube parent company Google has successfully quashed the Gonzalez family lawsuit in lower courts, arguing that Section 230 protects the company when it surfaces a video in the “Up Next” queue on YouTube, or when it ranks one link above another in search results.

But these wins have come over the objections of some prominent judges who say lower courts have read Section 230’s protections too broadly. “The Supreme Court should take up the proper interpretation of Section 230 and bring its wisdom and learning to bear on this complex and difficult topic,” wrote Judge Ronald M. Gould of the U.S. Court of Appeals for the 9th Circuit.

Google general counsel Halimah DeLaine Prado said the Supreme Court’s review risks opening up the entire tech industry to a new onslaught of lawsuits, which could make it too costly for some small businesses and websites to operate. “It goes beyond just Google,” DeLaine Prado said. “It really does impact the notion of American innovation.”

The case comes amid growing concern that the laws that govern the internet — many forged years before the invention of social media platforms like Facebook, YouTube, Twitter or TikTok — are ill equipped to oversee the modern web. Politicians from both parties are clamoring to introduce new digital rules after the U.S. government has taken a largely laissez-faire approach to tech regulation over the last three decades. But efforts to craft new laws have stalled in Congress, pushing courts and state legislatures to take up the mantle.

Now, the Supreme Court is slated to play an increasingly central role. After hearing the Google case on Tuesday, the justices on Wednesday will take up Twitter v. Taamneh, another case brought by the family of a terrorist attack victim alleging social media companies are responsible for allowing the Islamic State to use their platforms.

And in the term beginning in October, the court is likely to consider challenges to a law in Florida that would bar social media companies from suspending politicians, and a similar law in Texas that blocks companies from removing content based on a user’s political ideology.


“We’re at a point where both the courts and legislators are considering whether they want to continue to have a hands-off approach to the internet,” said Jeff Kosseff, a cybersecurity law professor at the United States Naval Academy and the author of “The Twenty-Six Words That Created the Internet.”

Section 230 was crafted following litigation against early internet companies, after one court found Prodigy Services liable for defamatory comments on its site. At the time, message boards reigned supreme, and Americans were newly joining services such as CompuServe, Prodigy and AOL, where their unvetted posts could reach millions.

After the decision, Congress stepped in to ensure the judgment did not stifle innovation on the fledgling internet. The result was Section 230.


The key portion of Section 230 is only 26 words long and says no tech platform “shall be treated as the publisher or speaker of any information provided by another information content provider.”

The seemingly innocuous law, which was part of the 1996 Communications Decency Act, received little media attention or fanfare when it was first drafted. Yet it has become increasingly controversial as it has been dragged into contentious battles over what content should remain on social media.

Over the last half-decade, members of Congress have put forward dozens of proposals to either repeal the law or create carve-outs requiring tech companies to address harmful content, such as terrorism or child sexual exploitation, on their platforms.

Former president Donald Trump and President Biden have criticized the provision, calling for its repeal, but for different reasons. Democrats largely argue that Section 230 allows tech companies to duck responsibility for the hate speech, misinformation and other problematic content on their platforms. Republicans, meanwhile, allege companies take down too much content, and have sought to address long-running accusations of political bias in the tech industry by altering the provision.

“Part of the ‘why now’ is that we’ve all woken up 20 years later, and the internet is not great,” said Hany Farid, a professor at the University of California at Berkeley, at a recent event hosted by the Brookings Institution.

Some Supreme Court justices have signaled a growing interest in grappling with the future of online speech — though not specifically the challenge in the Gonzalez case of algorithmic recommendations. Supreme Court Justice Clarence Thomas said in 2020 that it “behooves” the court to find a proper case to review Section 230. He suggested that courts have broadly interpreted the law to “confer sweeping immunity on some of the largest companies in the world.” In a 2021 opinion, Thomas suggested that the ability of social media platforms to remove speech could raise First Amendment concerns, and that government regulation could be warranted.


But the key question in Gonzalez — whether the providers are immunized when their algorithms target and recommend specific content — has not been Thomas’s focus. He and Justice Samuel A. Alito Jr. have expressed more concern about decisions by providers to take down content or ban speakers. Those issues will be raised more clearly when the court confronts laws from Florida and Texas that provide such regulation. The lower courts are divided on the constitutionality of the laws, and the court has asked the Biden administration to weigh in on whether to review the laws.

Alito, joined by Thomas and Justice Neil M. Gorsuch, last year made clear they expect the court to review laws that address “the power of dominant social media corporations to shape public discussion of the important issues of the day.”

Some legal experts argue that legislators in the 1990s could never have anticipated how the modern internet could be abused by bad actors, including terrorists. The same Congress that passed Section 230 also passed anti-terrorism laws, Mary B. McCord, executive director of the Institute for Constitutional Advocacy and Protection at Georgetown Law, said during a briefing for reporters.

“It’s implausible to think that Congress could have been thinking to cut off civil liability completely … for people who are victims of terrorism at the same time they were passing renewed and expanded legal authorities to combat terrorism,” she said.

Yet other legal experts expressed skepticism of a heavy-handed approach to tech regulation. Kosseff, the cybersecurity law professor, warned the push to use the power of government to address problems with the internet may be “really short sighted.”

“Once you give up power to the government over speech, you’re not getting it back,” he said.

‘Upending the modern internet’

The majority of the 75 amicus briefs filed by nonprofits, legal scholars and businesses favor Google. Groups or individuals that receive funding from Google produced 37 briefs, and nine others came from other tech companies whose business would be affected by changes to Section 230, including Facebook parent company Meta and Twitter.

A brief submitted by the provision’s original authors, Sen. Ron Wyden (D-Ore.) and former Rep. Christopher Cox, argues Section 230, as originally crafted, protects targeted recommendations. Wyden and Cox say the recommendation systems that YouTube uses today aren’t that different from the decisions platforms were making at the time 230 was written.

They “are the direct descendants of the early content curation efforts that Congress had in mind when enacting Section 230,” they wrote.

But the Biden administration is siding, at least in part, with the Gonzalez plaintiffs. While Section 230 protects YouTube for allowing ISIS-affiliated content on the site, the government says, recommending content through the use of algorithms and other features requires a different analysis, without blanket immunity.

Google disputes that recommendations are endorsements. “Recommendation algorithms are what make it possible to find the needles in humanity’s largest haystack,” Google tells the court. “Given that virtually everyone depends on tailored online results, Section 230 is the Atlas propping up the modern internet — just as Congress envisioned in 1996.”

Farid said that in the Gonzalez case, the justices are grappling with many of the problems in the tech industry that have emerged over the last decade. He said there’s a growing urgency to address harms online as technology accelerates, especially with the recent boom in artificial intelligence.

“We need to do better in the future,” Farid said. “We need to get out ahead of these problems and not wait until they get so bad that we start overreacting.”
