Meta acknowledged in a statement to The Washington Post that Threads is intentionally blocking the search terms, and said other terms are blocked as well, though the company declined to provide a full list. A search by The Post found that the words “sex,” “nude,” “gore,” “porn,” “coronavirus,” “vaccines” and “vaccination” are also among the blocked terms.
“The search functionality temporarily doesn’t provide results for keywords that may show potentially sensitive content,” the statement said, adding that the company will add search functionality for terms only “once we are confident in the quality of the results.”
Lucky Tran, director of science communication at Columbia University, discovered this himself when he attempted to use Threads to seek out research related to covid, something he says he does every day. “I was excited by search [on Threads],” he said. “When I typed in covid, I came up with no search results.”
Other public health workers criticized the company’s decision and said its timing was especially poor, given the current coronavirus uptick. Hospitalizations jumped nearly 16 percent in the United States last week and have been rising steadily since July, according to CDC data, though they remain below their level in the comparable week a year ago. Deaths are less than a quarter of what they were a year earlier, CDC statistics show.
Julia Doubleday, outreach director at the World Health Network, a nonprofit dedicated to fighting the coronavirus, said: “Social media is a lifeline for patients, literally. Long covid patients have died of organ failure, infections, cardiac events and more, and social media is one place they can share information. Cutting off communication between suffering and disabled patients is cruel in the extreme. It’s indefensible.”
“The decision to censor searches about covid will make it harder for public health experts and people who work in public health to get out important info to the public about how they can protect themselves,” Tran said.
In a 2021 survey by the Pew Research Center, about 4 in 10 American adults said social media was an important source of coronavirus vaccine news.
“Censoring searches for covid and long covid will only leave an information gap that will be filled by misinformation from elsewhere,” Tran said. “The best solution is to take proactive steps to elevate multiple trusted sources and address misinformation.”
The incident is the latest indication that Meta, which owns Facebook and Instagram and operates Threads as part of Instagram, is seeking to avoid controversy on the new platform. In July, Instagram CEO Adam Mosseri said that Threads is “not going to do anything to encourage” politics and “hard news,” and that “the goal isn’t to replace Twitter.”
The ability to share real-time news and information, however, was crucial to the rise of Twitter and remains one of its core functionalities.
“Ever since Elon [Musk] took over Twitter, people with long covid have been experiencing more harassment, and it’s been harder to connect on there,” said Fiona Lowenstein, editor of “The Long COVID Survival Guide,” a book about managing long covid symptoms.
“The whole reason we know about long covid in the first place is because people with long covid took to social media and started talking about their experiences,” Lowenstein said, noting that the term “long covid” itself was coined by a Twitter user before being adopted by the CDC, the World Health Organization and other health organizations.
Emily Vraga, an associate professor at the University of Minnesota’s Hubbard School of Journalism and Mass Communication, said the decision to block search results for important keywords “does not situate Threads as a replacement for the Twitter that once existed.”
The decision, Vraga said, was indicative of Meta’s apparent inability to meaningfully moderate content at scale.
“Meta and all of its products have long had a hands-off approach,” she said. “They really don’t want to be seen as deciding truth versus not truth, and I think this is a continuation of that. They are often sidestepping the really complicated and very difficult [moderation] decisions.”
Hany Farid, a professor at the University of California at Berkeley who specializes in technology and disinformation, said that blocking search results for certain terms does at least show that Meta is thinking about disinformation, though he called blocking search terms an imprecise moderation method.
“All this talk about AI and large language models and all these amazing technological innovations,” he said, “and one of the top tech companies in the world is resorting to these really crude instruments for content moderation.” Farid said the decision could be a sign that Meta “is continually not investing in doing better content moderation, so they’re resorting to this very blunt instrument.”
Blocking certain words from search outright is also ultimately ineffective, Farid said, because users will quickly develop euphemisms and turns of phrase to get around them.