The Supreme Court on Monday stepped into the politically divisive issue of whether tech companies should have immunity over problematic content posted by users, agreeing to hear a case alleging that YouTube aided and abetted the killing of an American woman in the 2015 Islamic State terrorist attacks in Paris.
The family of Nohemi Gonzalez, one of 130 people killed in a series of linked attacks carried out by the militant Islamist group, argued that YouTube’s active role in recommending videos overcomes the liability shield for internet companies that Congress created in 1996 as part of the Communications Decency Act.
The provision, Section 230 of the act, says internet companies are not liable for content posted by users. It has come under heavy scrutiny from the right and left in recent years, with conservatives claiming that companies are inappropriately censoring content and liberals saying that social media companies are spreading dangerous right-wing rhetoric. The provision leaves it to companies to decide whether certain content should be removed and does not require them to be politically neutral.
Gonzalez was a 23-year-old college student studying in France when she was killed while dining at a restaurant during the wave of attacks, which also targeted the Bataclan concert hall.
Her family is seeking to sue Google-owned YouTube for allegedly allowing ISIS to spread its message. The lawsuit targets YouTube’s use of algorithms to suggest videos for users based on content they have previously viewed.

YouTube’s active role goes beyond the kind of conduct that Congress intended to protect with Section 230, the family’s lawyers allege. They say in court papers that the company “knowingly permitted ISIS to post on YouTube hundreds of radicalizing videos inciting violence” that helped the group recruit supporters, some of whom then conducted terrorist attacks. YouTube’s video recommendations were key to spreading ISIS’s message, the lawyers say. The plaintiffs do not allege that YouTube had any direct role in the killing.
Gonzalez’s relatives, who filed their 2016 lawsuit in federal court in northern California, hope to pursue claims that YouTube violated a federal law called the Anti-Terrorism Act, which allows people to sue individuals or entities who “aid and abet” terrorist acts. A federal judge dismissed the lawsuit, but it was revived by the San Francisco-based 9th U.S. Circuit Court of Appeals in a June 2021 decision that also resolved similar cases brought against tech companies by families of victims of other terrorist attacks.
Google’s lawyers urged the court not to hear the Gonzalez case, saying in part that the lawsuit would likely fail whether or not Section 230 applies.
The Supreme Court has previously declined to take up cases on Section 230, although conservative Justice Clarence Thomas has criticized it, citing the market power and influence of tech giants.
A related issue is also likely headed to the Supreme Court, concerning a law enacted by Republicans in Texas that seeks to prevent social media companies from barring users who make inflammatory political comments. On Sept. 16, a federal appeals court upheld the law, which the Supreme Court in May had prevented from going into effect.
In a separate move, the court also said it would hear a related appeal brought by Twitter on whether the company can be liable under the Anti-Terrorism Act. The same appeals court that handled the Gonzalez case revived claims brought by relatives of Nawras Alassaf, a Jordanian citizen killed in an Islamist attack in Istanbul in 2017. The relatives accused Twitter, Google and Facebook of aiding and abetting the spread of militant Islamic ideology. In that case, the question of Section 230 immunity had not yet been addressed.