Facebook Oversight Board says company should be more transparent in report

More than a year after its creation, the Facebook Oversight Board argued in the first of what are to be annual reports that the social media company should be far more transparent about how it decides which posts and accounts to leave up and which to take down.

The board, an international panel of human rights advocates, politicians and academics that oversees Facebook’s thorniest content moderation decisions, said the company had made some progress in implementing the board’s policy recommendations but needed to share more information about its content removal systems. The group took aim at the opacity of the company’s strikes system, which gives users who break the platform’s content guidelines a set number of passes and imposes escalating punishments before their accounts are suspended.

“The Board is encouraged by first-year trends in its engagement with Meta, but the company must urgently improve its transparency,” the group said in the report. “The Board continues to have significant concerns, including around Meta’s transparency and provision of information related to certain cases and policy recommendations.”

Last year, Facebook renamed its parent company Meta. Meta spokesperson Dan Chaison said in a statement that the board’s decisions have “resulted in substantive changes to our content moderation policies and enforcement practices.”

“We know that we will always have room for improvement, and we are grateful for the board’s thoughtful input on the most significant and difficult questions we face,” Chaison said.

Facebook conceived the Oversight Board as an experiment, as regulators around the world were trying to craft uniform rules governing social media platforms. The company argued that the board could chart direction for content policy and be a model for other companies’ governance structures.

But critics have asked whether a board given no formal authority and serving at the pleasure of the company has enough power to force Facebook to follow its recommendations for issues plaguing its platform, including misinformation and hate speech. While the board has offered independent supervision of the company, it is dependent on Facebook to give it information, funding and the power to make change.

The annual report sheds light on some of the challenges facing the group as it makes critical decisions about how the company should support the free expression of its users while mitigating the harms of problematic speech. In public comments and case decisions, the board has repeatedly chastised Facebook for not giving the Oversight Board and users enough information to evaluate the company’s content moderation systems.

Since its creation, the board has ruled on an array of cases, including deciding that an Instagram user’s breast cancer awareness post did not violate the company’s rules against nudity, and opining on whether Facebook should have suspended the account of then-President Donald Trump over his role in the Jan. 6 attack on the U.S. Capitol.

The board said Wednesday that of the 20 cases the company and users referred to it in 2021, it overturned Facebook’s decisions in 14 and upheld them in six.

In the case of Trump, the board affirmed Facebook’s decision to suspend the former president but told the company it must clarify its policies on penalties for rule-breaking politicians and make the final decision itself on whether Trump could return to the platform. Facebook eventually decided to suspend Trump for two years, opening the door for him to return to the site before the 2024 presidential election.

Under the rules, Facebook and its users can appeal to the Oversight Board cases in which the company has taken down posts for violating its community standards, the rules it imposes against hate speech, harassment and other problematic content. The decisions the Oversight Board makes in these cases are considered binding.

Separately, the Oversight Board can issue policy recommendations for changes to the company’s content moderation systems, but those are not considered binding.

Overall, Meta committed to at least partially implementing two-thirds of the board’s 86 policy recommendations, according to the report. For the remaining recommendations, Meta said it either already does the suggested work, would not take action or would assess the feasibility of implementing them.

Among its most common recommendations, the board urged Facebook to give users more information about the rules they have broken when their content is removed.

“Our recommendations have repeatedly urged Meta to follow some central tenets of transparency,” the board said in the report. “Make your rules easily accessible in your users’ languages; tell people as clearly as possible how you make and enforce your decisions; and, where people break your rules, tell them exactly what they’ve done wrong.”

Facebook now tells English-speaking users when their content is removed for hate speech and is testing that practice for content in Arabic, Spanish and Portuguese, as well as for posts removed for bullying and harassment, the report said.

The Oversight Board said it also launched an implementation committee to evaluate whether the company is actually making the policy changes it has promised in response to the board’s recommendations.

Tension between the Oversight Board and Facebook flared last fall when the board chastised Facebook for its lack of transparency about a program meant to exempt famous people from facing penalties over posts that break the company’s content rules. At the time, citing Facebook internal documents, the Wall Street Journal had reported that while the company told the board that the program affects only a “small number of decisions,” it actually included at least 5.8 million users in 2020. The board pounced, arguing that the company had not been “fully forthcoming.”
