
State Dept. cancels Facebook meetings after judge’s ‘censorship’ ruling


One day after a Louisiana federal judge set limits on the Biden administration’s communications with tech firms, the State Department canceled its regular meeting Wednesday with Facebook officials to discuss 2024 election preparations and hacking threats, according to a person at the company. The move came hours before Biden’s Department of Justice filed a notice that it will appeal the ruling.

State Department officials told Facebook that all future meetings, which had been held monthly, have been “canceled pending further guidance,” said the person, who spoke on the condition of anonymity to preserve working relationships. “Waiting to see if CISA cancels tomorrow,” the person added, referring to the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency.

The person at Facebook said they presumed similar meetings the State Department had scheduled with other tech companies also were canceled, but that could not be confirmed immediately. Representatives for the State Department did not respond to a request for comment. CISA declined to comment, referring questions to the Justice Department. A Justice Department official declined to comment on the cancellations, but said the agency expects to request a stay of the District Court’s decision “expeditiously.”

Representatives for Google, which owns YouTube, and other social media companies did not immediately respond.

The cancellation of regular meetings between Facebook parent company Meta — the world’s largest social media firm — and U.S. government agencies shows the immediate impact of Tuesday’s order by U.S. District Judge Terry A. Doughty, a Trump appointee. The order is a win for the political right in a broader battle over the role of social media companies in shaping online speech and information.

While the ruling won’t stop sites such as Facebook, Instagram, YouTube or TikTok from moderating online content, it stands to sideline federal government officials and agencies that had become key contributors to those efforts. In the past, meetings between the State Department and Facebook in particular have flagged suspected foreign influence operations for the companies to investigate.

The canceled meetings show that the injunction is already affecting government efforts to protect elections, said another person familiar with the discussions.

When tech companies and State Department officials meet, “They talk about foreign influence, they compare notes, it gives them the opportunity to ask questions about foreign influence they are seeing,” this person said. “State will share Russian narratives, things they are seeing in state media in Russia about U.S. topics. They will ask whether Facebook is seeing things from known entities, such as the Chinese Communist Party or the Internet Research Agency,” the Russian entity thought responsible for much of the interference in the 2016 election.

The person at Facebook confirmed that the meetings include two-way sharing of information on foreign influence operations.

A former Department of Homeland Security official, speaking on condition of anonymity because they feared legal or political retaliation, said they believed meetings are being canceled because general counsels at the various agencies are still parsing the implications of the 155-page ruling. Ultimately, many of the activities they pursued, such as warnings about election disinformation, are exempted from the injunction and are likely to continue, the person said.

“I would expect to see DOJ or the White House take the first public steps,” the former official said. “There will likely be a chilling effect from overly cautious government counsels. What previously had been in-bounds will look too close to the line, or we’re not sure how it’s going to work.”

Issued on the Fourth of July, the order found that the Biden administration probably violated the First Amendment in applying pressure to Facebook, YouTube, Twitter and other social media firms to restrict the viral spread of posts that stoked fears about coronavirus vaccines or fueled false claims related to U.S. elections.

Leading U.S. social media companies began coordinating regularly with the federal government in 2017, following revelations of a Russian campaign to sow discord among Americans during the 2016 presidential election campaign. Partnerships between Silicon Valley and Washington on what the tech companies call “content moderation” deepened and broadened during the pandemic, when platforms such as Twitter, Google’s YouTube, and Meta’s Facebook and Instagram became hotbeds for conspiracy theories about the virus and opposition to public health guidance.

The attorneys general of Missouri and Louisiana, along with a host of other plaintiffs, sued Biden and a bevy of government agencies and officials in 2022, alleging that they had cajoled and coerced the tech firms into removing or suppressing speech that is protected under the First Amendment. The Biden administration has argued that it did not violate the First Amendment, but rather used its bully pulpit to promote accurate information in the face of a public health crisis and foreign interference in U.S. elections.

On Tuesday, Doughty, who sought to block several Biden administration mandates during the pandemic, sided largely with the plaintiffs. He issued a preliminary injunction that prohibits several federal agencies and their employees from “meeting with social-media companies for the purpose of urging, encouraging, pressuring, or inducing in any manner the removal, deletion, suppression, or reduction of content containing protected free speech.”

Doughty’s decision left exceptions for the government to continue to communicate with tech companies, including carveouts for officials to warn Silicon Valley about criminal activity, foreign election interference and cyberattacks.

The White House did not immediately respond to a request for comment on the meeting cancellations. White House press secretary Karine Jean-Pierre said during a briefing with reporters on Wednesday that the administration disagrees with the injunction. The Department of Justice continues to review it and evaluate its options, she said.

Jean-Pierre said that the administration has been “consistent” in its dealings with tech firms and that it will “continue to promote responsible actions to protect public health, safety and security when confronted by challenges like a deadly pandemic and foreign attacks on our elections.”

“Our view remains that social media platforms have a critical responsibility to take action or to take account of the effects their platforms are having [on] the American people, but make independent choices about the information they present,” she said.

Meta, Twitter and Google have declined to comment on the injunction. But the judge’s decision creates uncertainty about the future of content moderation at the companies ahead of the 2024 elections and raises legal questions about how they will communicate with officials at all levels of government about falsehoods related to voting.

The Biden administration is likely to appeal the injunction before voters head to the polls next year. But in the interim, the order is poised to have a chilling effect on the companies’ efforts to guard against misinformation as they work to sort out what types of partnerships are still allowed.

Tech companies are already taking significant steps to unwind programs to combat disinformation on their services. Under Elon Musk’s leadership, Twitter has slashed its trust and safety teams and initiatives. Amid financial pressure and company layoffs, Meta has also made cuts to similar teams.

“There is so much wrong with this decision — not least of all that it will make us less secure going into the 2024 elections,” wrote Yoel Roth, the former head of Trust and Safety at Twitter, in a social media post. Roth said the most glaring problem with the decision is that it asserts the companies were “coerced” to remove posts simply because they met with government officials. “That’s just … not how any of this works,” he wrote.

Roth’s work at Twitter has drawn scrutiny from Republican politicians. In testimony before Congress, he said that Twitter independently made decisions to remove content its staffers believed violated its rules. He said the U.S. government “took extraordinary efforts” to ensure it did not even hint at demanding the company remove posts.

Emails produced as evidence in the case also show tech companies pushing back against the Biden administration, at times telling government officials that the videos or posts they flagged did not run afoul of their policies against misinformation. The companies’ decisions frequently appeared to prompt frustration among Biden White House officials.

In April 2021, then-White House adviser Andy Slavitt sent an email to Facebook staff with the subject line, “Tucker Carlson anti-vaccine message,” noting that it was “number one on Facebook.” Later that day, a Facebook staffer responded, saying the video did not qualify for removal under its policies. The employee said the company was demoting the video and labeling it with a “pointer” to accurate information about the vaccine.

The company’s decision to leave the video up prompted backlash from Rob Flaherty, another former White House official, who responded: “Not for nothing but last time we did this dance, it ended in an insurrection.”

The lawsuit also named as defendants several academics and civil society organizations that had contributed to partnerships between the online platforms and the government. On Wednesday, researchers outside of government and companies were reeling from the injunction and still sorting out how to handle it.

“There’s no version of us being able to do our job, or other versions of the field of trust and safety, without being able to communicate with all stakeholders, including government and including industry,” said a leading researcher on extremism and foreign influence who asked not to be named due to the ongoing litigation.

Another researcher, who also spoke on condition of anonymity due to pending litigation, added: “Platforms had already gutted their trust and safety departments, and now they aren’t supposed to [talk to the] government.” They added, however, that “information sharing between platforms and government in this area was always fairly minimal.”

Doughty’s ruling is unlikely to be the last word on the question of what level of government pressure on platforms constitutes a First Amendment violation, said Jeff Kosseff, a cybersecurity law professor at the U.S. Naval Academy.

“The really tough question is when does the government cross the line from responding to speech — which it can and should do — to coercing platforms to censor constitutionally protected speech?” Kosseff said. “The judge here believes that line was crossed, and he certainly cited some persuasive examples,” such as administration officials suggesting antitrust actions against tech firms or changes to their liability protections while criticizing their content moderation efforts.
