CNN —
The Supreme Court on Tuesday is set to hear oral arguments in the first of two cases this week with the potential to reshape how online platforms handle speech and content moderation.
Tuesday's arguments are in a case known as Gonzalez v. Google, which zeroes in on whether the tech giant can be sued over its subsidiary YouTube's algorithmic promotion of terrorist videos.
According to the plaintiffs in the case — the family of Nohemi Gonzalez, who was killed in a 2015 ISIS attack in Paris — YouTube’s targeted recommendations violated a US antiterrorism law by helping to radicalize viewers and promote ISIS’s worldview.
The suit seeks to carve content recommendations out of the protections of Section 230, a federal law that has for decades largely shielded websites from lawsuits over user-generated content. If successful, it could expose tech platforms to an array of new lawsuits and reshape how social media companies run their services.
“I just don’t want my daughter’s life to be washed out like that. I want something to be done,” said Beatriz Gonzalez, Nohemi’s mother, in an interview with CNN. “We’re searching for justice. Somebody has to be responsible for what happened. Not only to me, but to many other families that have lost their loved ones.”
Nitsana Leitner, the Gonzalez family’s attorney, told CNN that Google should be held liable because by allowing ISIS videos to circulate on the platform, the company profited from the terrorist group’s activities.
“If you use the content for your benefit, you have to pay for your wrongdoing,” Leitner said.
Google and other tech companies have said that exempting targeted recommendations from Section 230 immunity would increase the legal risks associated with ranking, sorting and curating online content, a basic feature of the modern internet. Google has claimed that in such a scenario, websites would seek to play it safe by either removing far more content than is necessary, or by giving up on content moderation altogether and allowing even more harmful material on their platforms.
Friend-of-the-court filings by Craigslist, Microsoft, Yelp and others have suggested that the stakes are not limited to algorithms and could also end up affecting virtually anything on the web that might be construed as making a recommendation. That might mean even average internet users who volunteer as moderators on various sites could face legal risks, according to a filing by Reddit and several volunteer Reddit moderators.
Oregon Democratic Sen. Ron Wyden and former California Republican Rep. Chris Cox, the original co-authors of Section 230, argued to the Court that Congress’ intent in passing the law was to give websites broad discretion to moderate content as they saw fit.
The Biden administration has also weighed in on the case. In a brief filed in December, it argued that Section 230 does protect Google and YouTube from lawsuits “for failing to remove third-party content, including the content it has recommended.” But, the government’s brief argued, those protections do not extend to Google’s algorithms because they represent the company’s own speech, not that of others.
On Wednesday, the Court will hear arguments in a second case, Twitter v. Taamneh, which asks whether social media companies can be sued for aiding and abetting a specific act of international terrorism when their platforms hosted user content expressing general support for the group behind the violence but not referring to the specific attack.
Rulings in both cases are expected by the end of June.
– CNN’s Jessica Schneider contributed to this report.