In a 6-3 ruling, the court said the states and individuals could not show they were directly harmed by the communication between federal officials and social media platforms.
Writing for the majority, Justice Amy Coney Barrett said companies such as Facebook and YouTube have long-standing content-moderation policies that place warning labels on certain posts and delete others. The challengers, Barrett wrote, did not demonstrate that the companies’ actions to remove posts were traceable to the government.
Barrett said a lower court got it wrong when it “glossed over complexities in the evidence” by attributing to the Biden administration every company decision to remove or moderate content.
“While the record reflects that the Government defendants played a role in at least some of the platforms’ moderation choices, the evidence indicates that the platforms had independent incentives to moderate content and often exercised their own judgment,” she wrote.
Justice Samuel A. Alito Jr., joined by Justices Clarence Thomas and Neil M. Gorsuch, dissented.
Alito criticized his colleagues in the majority for failing to address the underlying free-speech questions at issue in the case, calling efforts by the government to police content it sees as problematic a form of “coercion.”
The court “shirks that duty and thus permits the successful campaign of coercion in this case to stand as an attractive model for future officials who want to control what the people say, hear and think,” Alito wrote. “That is regrettable.”
The case, known as Murthy v. Missouri, gave the Supreme Court an opportunity to shape how government officials interact with social media companies and communicate with the public online. The dispute is one of several before the justices this term that test Republican-backed claims that social media companies are working with Democratic allies to silence conservative voices.
Wednesday’s ruling could have implications for the U.S. government’s efforts to combat foreign disinformation during a critical election year when nearly half of the world’s population will go to the polls. The U.S. government largely halted its warnings to U.S. tech companies about foreign influence campaigns last year, after lower-court decisions that placed broad limits on such communications. As the 2024 presidential elections approach, the FBI has resumed some limited communications with the companies, according to people familiar with the matter, who spoke on the condition of anonymity to discuss internal affairs.
White House press secretary Karine Jean-Pierre said the court’s ruling ensures the administration can engage with social media and other tech companies on topics including terrorism threats, foreign influence campaigns, online harassment and mental health of children.
“Going forward, we will not back down from our consistent view that, while social media companies make independent decisions about the information they present, those companies have a critical responsibility to take into account the effects their platforms are having on the American people and the security of this nation,” she said in a written statement.
But Jameel Jaffer, executive director of the Knight First Amendment Institute at Columbia University, said the justices missed an opportunity to give clear guidance to tech companies and the federal government about how the First Amendment should apply to social media. Jaffer said the court needs to clarify how the line should be drawn between “legitimate government persuasion” and “illegal government coercion.”
“Government officials are going to be operating in a kind of gray area,” said Jaffer, whose organization filed a brief in the case in support of neither party. “There are dangers in both directions; that’s why we needed guidance from the Supreme Court.”
The ruling is a blow to a wide-ranging conservative legal campaign, which alleges that the federal government and tech companies have colluded to censor Republican views online. Rep. Jim Jordan (R-Ohio), the chair of the House Judiciary Committee, has been running a parallel investigation in Congress, probing the interactions among tech companies, the federal government and researchers. He said he disagreed with the court’s ruling and plans to continue his investigation.
“The First Amendment is first for a reason, and the freedom of expression should be protected from any infringement by the government,” Jordan said in a statement. “Our important work will continue.”
The litigation and the congressional investigations have already throttled a host of efforts to study misinformation online, with researchers saying the probes have exposed them to increased legal costs and personal attacks. The Stanford Internet Observatory, one of the most prominent institutions tracking online falsehoods, collapsed this month after the university incurred millions of dollars in legal fees related to the investigations and lawsuits. The Election Integrity Partnership, which the observatory operated in conjunction with the University of Washington, announced that it would not continue its work tracking voter suppression and election denial in the 2024 race or future elections.

Federal agencies have also pulled back. Last year, the National Institutes of Health froze a $150 million program intended to advance the communication of medical information, citing regulatory and legal threats.
Democrats, who say the government must be able to work with the private sector to keep dangerous false information from reaching the public, called on Jordan to halt his investigation in response to the Supreme Court’s findings.
“I hope that after this humiliating defeat Chairman Jordan and his colleagues will end their failed investigation into the companies, universities, and individuals who have been trying to stop the spread of harmful misinformation and disinformation on social media,” said Jerry Nadler, the top Democrat on the House Judiciary Committee.
The First Amendment prevents the government from censoring speech and punishing people for expressing different views. But the Biden administration told the court that officials are entitled to share information, participate in public debate and urge action, as long as their requests to remove content are not accompanied by threats.
Top industry groups representing major social media companies, including NetChoice and the Chamber of Progress, praised the Supreme Court for recognizing that the platforms have their own incentives to moderate content that are not necessarily influenced by the government.
“What we see in this decision is that the court actually understands how content moderation works,” said Jess Miers, senior legal advocacy counsel for Chamber of Progress, an industry coalition that includes Google, Meta and other companies.
“Platforms have an important reason to seek information from actors like the CDC or national security leaders, but at the end of the day, their content moderation decisions and platform policies are their own,” she said, referring to the Centers for Disease Control and Prevention.
The attorneys general of Missouri and Louisiana had argued that the federal government coerced social media companies to suppress speech of individual users and became too deeply involved in the companies’ decisions to remove certain content. Tech companies, they said, cannot act on behalf of the government to remove speech the government doesn’t like.
The record before the Supreme Court in Murthy v. Missouri included email messages between Biden administration officials and social media companies, including Facebook’s parent company, Meta, and Twitter. Those messages showed tense conversations in 2021 as the White House and public health officials campaigned for Americans to get the coronavirus vaccine.
On Wednesday, lawyers representing the individuals behind the lawsuit criticized the court for determining “against all evidence that the Federal Government will not be held accountable for the natural consequences of its speech squelching actions.”
“The Government can press third parties to silence you, but the Supreme Court will not find you have standing to complain about it absent them referring to you by name apparently,” John Vecchione, senior litigation counsel to the New Civil Liberties Alliance, said in a statement.
The justices were reviewing lower-court decisions that strictly limited federal employees from communicating with tech giants to remove harmful posts or misinformation. A district court judge in Louisiana ruled against the Biden administration and barred thousands of federal employees from improperly influencing tech companies to remove certain content.
The U.S. Court of Appeals for the 5th Circuit narrowed that decision to a smaller set of government officials and agencies, including the surgeon general’s office, the White House, the CDC and the FBI. A three-judge panel of the appeals court found that the White House “significantly encouraged the platforms’ decisions by commandeering their decision-making processes, both in violation of the First Amendment.”
Missouri Attorney General Andrew Bailey (R) said in a news release that he would continue the litigation in the lower courts and that his office is “evaluating all other options” to address allegations of censorship.
“My rallying cry to disappointed Americans is this: Missouri is not done. We are going back to the district court to obtain more discovery in order to root out Joe Biden’s vast censorship enterprise once and for all,” Bailey said.
Joseph Menn and Tyler Pager contributed to this report.