
Science and Nature studies on Facebook show algorithm not only problem

For all the blame Facebook has received for fostering extreme political polarization on its ubiquitous apps, new research suggests that the problem may not strictly be a function of the algorithm.

In four studies published Thursday in the academic publications Science and Nature, researchers from several institutions including Princeton University, Dartmouth College and the University of Texas collaborated with Meta to probe the impact of social media on democracy and the 2020 presidential election.

The authors, who received direct access to certain Facebook and Instagram data for their research, paint a picture of a vast social network made up of users who often seek out news and information that conforms to their existing beliefs. People who want to live in so-called echo chambers can easily do so, but that is as much about the stories and posts they go looking for as it is about the company’s recommendation algorithms.

In one of the studies in Science, the researchers showed what happens when Facebook and Instagram users see content via a chronological feed rather than an algorithm-powered feed.

Switching to a chronological feed during the three-month study period “did not significantly alter levels of issue polarization, affective polarization, political knowledge, or other key attitudes,” the authors wrote.

In another Science article, researchers wrote that “Facebook, as a social and informational setting, is substantially segregated ideologically — far more than previous research on internet news consumption based on browsing behavior has found.”

In each of the new studies, the authors said Meta was involved with the research but that the company did not pay them for their work and that they were free to publish their findings without interference.

One study published in Nature examined the notion of echo chambers on social media, drawing on more than 20,000 adult Facebook users in the U.S. who opted into the research during a three-month window spanning the run-up to and the aftermath of the 2020 presidential election.

The authors found that the average Facebook user gets about half of the content they see from people, pages or groups that share their beliefs. When the researchers altered the mix of content these users received to make it more diverse, the change did not shift users’ views.

“These results are not consistent with the worst fears about echo chambers,” they wrote. “However, the data clearly indicate that Facebook users are much more likely to see content from like-minded sources than they are to see content from cross-cutting sources.”

The polarization problem exists on Facebook, the researchers all agree, but the question is whether the algorithm is intensifying the matter.

One of the Science papers found that when it comes to news, “both algorithmic and social amplification play a part” in driving a wedge between conservatives and liberals, leading to “increasing ideological segregation.”

“Sources favored by conservative audiences were more prevalent on Facebook’s news ecosystem than those favored by liberals,” the authors wrote, adding that “most sources of misinformation are favored by conservative audiences.”

Holden Thorp, Science’s editor-in-chief, said in an accompanying editorial that data from the studies show that “the news fed to liberals by the engagement algorithms was very different from that given to conservatives, which was more politically homogeneous.”

In turn, “Facebook may have already done such an effective job of getting users addicted to feeds that satisfy their desires that they are already segregated beyond alteration,” Thorp added.

Meta tried to spin the results favorably after enduring years of attacks for actively spreading misinformation during past U.S. elections.

Nick Clegg, Meta’s president of global affairs, said in a blog post that the studies “shed new light on the claim that the way content is surfaced on social media — and by Meta’s algorithms specifically — keeps people divided.”

“Although questions about social media’s impact on key political attitudes, beliefs, and behaviors are not fully settled, the experimental findings add to a growing body of research showing there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization or have meaningful effects on these outcomes,” Clegg wrote.

Still, several authors involved with the studies conceded in their papers that further research is necessary to study the recommendation algorithms of Facebook and Instagram and their effects on society. The studies were based on data gleaned from one specific, short time frame coinciding with the 2020 presidential election, and further research could unearth more details.

Stephan Lewandowsky, a University of Bristol psychologist, was not involved with the studies but was shown the findings and given the opportunity to respond to Science as part of the publication’s package. He described the research as “huge experiments” that show “that you can change people’s information diet, but you’re not going to immediately move the needle on these other things.”

Still, the fact that Meta participated in the study could influence how people interpret the findings, he said.

“What they did with these papers is not complete independence,” Lewandowsky said. “I think we can all agree on that.”

