The project’s director, Joan Donovan, one of the country’s most widely cited experts on digital “media manipulation,” is not a faculty member and therefore could not continue to lead the project, Gibbs said.
Donovan, whose title is research director, is considered a staff member of the Shorenstein Center; it is not clear whether she was given the option to move into a faculty role during her time at Harvard.
Kennedy School officials declined to comment on personnel matters. Donovan declined to comment.
The move was first reported by the Harvard Crimson, the university's student newspaper.
The project’s sunsetting in the months before the 2024 election is a surprising turn in one of the American research world’s hottest areas of study: how the interplay of technology, political opportunism and unwitting internet users has shaped public conversation and democratic debate.
But it also comes as the study of what’s known as “misinformation,” supercharged by the Trump presidency, enters a new era.
Twitter, now led by the meme-sharing billionaire Elon Musk, has worked to end the platform’s long-standing openness to free, real-time research. The company announced late Wednesday that on Feb. 9 it will begin charging for automated access to its data through its application programming interfaces (APIs), a move that will hurt both developers and researchers.
Some researchers also have faced harassment online or been criticized by Republican lawmakers over claims their work is skewed by a liberal agenda. The platforms they study have shifted, too, from Facebook and Twitter to TikTok, Discord and Twitch, which present new challenges for data gathering, analysis and debate.
At the same time, more governments and commercial enterprises are waging information campaigns, and getting better at it, said Lisa Kaplan, chief executive of misinformation tracker Alethea Group.
“The enhanced operational security tactics employed by sophisticated actors and the proliferation of platforms with varying community standards and enforcement, combined with various access to data has ultimately changed the nature of the threat,” Kaplan said. “Reducing free access to limited and research APIs also inhibits the academic community to study these types of threats.”
Harvard’s move came as a shock to Donovan’s supporters, including Craig Newmark, the philanthropist founder of Craigslist, who said he was trying to learn why her project was being shut down after he had donated $5 million to it.
“Joan Donovan’s work must continue, for our national security,” he said. “She is defending the country against people who want to do us harm.”
“It’s such a crushing and damaging decision based on all the good work the program has accomplished,” said Chris Gilliard, a scholar of technology and surveillance who served as a visiting research fellow at the center last year. Gilliard is now a Just Tech Fellow at the Social Science Research Council, a century-old Brooklyn-based nonprofit that encourages public policy research.
Kennedy School officials said the school would continue to study similar issues through other projects, including one known as the Misinformation Review.
One project linked to Donovan’s group, devoted to the publication of internal Facebook files that whistleblower Frances Haugen shared with journalists in 2021, will continue to operate under the leadership of Kennedy School professor Latanya Sweeney, school officials said.
Harvard’s schools operate largely independently and house major research centers with distinct areas of focus. Donovan’s group had become one of the more publicly visible inside the Shorenstein Center, due in part to Donovan’s frequent appearances in the news media and before Congress.
Laura Edelson, a postdoctoral researcher in computer science at New York University, said Donovan had made a major contribution to the field.
“The science is getting better and moving forward,” she said. “That isn’t going to happen if we don’t continue to have access to data, and researchers aren’t allowed to continue to do work even when it threatens major tech platforms.”
Will Oremus, Aaron Schaffer and Cat Zakrzewski contributed to this report.