“The Board is encouraged by first-year trends in its engagement with Meta, but the company must urgently improve its transparency,” the group said in the report. “The Board continues to have significant concerns, including around Meta’s transparency and provision of information related to certain cases and policy recommendations.”
Last year, Facebook renamed its parent company Meta. Meta spokesperson Dan Chaison said in a statement that the board’s decisions have “resulted in substantive changes to our content moderation policies and enforcement practices.”
“We know that we will always have room for improvement, and we are grateful for the board’s thoughtful input on the most significant and difficult questions we face,” Chaison said.
Facebook conceived the Oversight Board as an experiment, as regulators around the world were trying to craft uniform rules governing social media platforms. The company argued that the board could chart a course for content policy and serve as a model for other companies' governance structures.
But critics have asked whether a board given no formal authority and serving at the pleasure of the company has enough power to force Facebook to follow its recommendations for issues plaguing its platform, including misinformation and hate speech. While the board has offered independent supervision of the company, it is dependent on Facebook to give it information, funding and the power to make change.
The annual report sheds light on some of the challenges facing the group as it makes critical decisions about how the company should support the free expression of its users while mitigating the harms of problematic speech. In public comments and case decisions, the board has repeatedly chastised Facebook for not giving the Oversight Board and users enough information to evaluate the company’s content moderation systems.
Since its creation, the board has ruled in an array of cases, including deciding whether an Instagram user's breast cancer awareness post violated the company's rules against nudity, and opining on whether Facebook should have suspended the account of then-President Donald Trump over his role in the Jan. 6 attack on the U.S. Capitol.
The board said Wednesday that of the 20 cases the company and users referred to it in 2021, it overturned Facebook's decisions 14 times and upheld them six times.
In the case of Trump, the board affirmed Facebook’s decision to suspend the former president but told the company it must clarify its policies about the penalties for rule-breaking politicians and make the final decision on whether Trump could return to the platform. Facebook eventually decided to suspend Trump for two years, opening the door for him to return to the site before the 2024 presidential election.
Under the rules, Facebook and its users may appeal to the Oversight Board cases in which the company has taken down posts for violating its community standards — the rules it imposes against hate speech, harassment and other problematic types of content. The board's decisions in these cases are considered binding.
Separately, the Oversight Board can issue policy recommendations for changes to the company’s content moderation systems, but those are not considered binding.
Overall, Meta committed to at least partially implementing two-thirds of the board's 86 policy recommendations, according to the report. For the remaining recommendations, Meta said it either already does the work suggested, would not take action or would assess the feasibility of implementing the board's suggestions.
Among the most common recommendations, the board urged Facebook to give users more information about the rules they are breaking when their content is removed.
“Our recommendations have repeatedly urged Meta to follow some central tenets of transparency,” the board said in the report. “Make your rules easily accessible in your users’ languages; tell people as clearly as possible how you make and enforce your decisions; and, where people break your rules, tell them exactly what they’ve done wrong.”
Facebook now tells English-speaking users when their content is removed for hate speech and is testing that policy for content in Arabic, Spanish and Portuguese, as well as for posts removed for bullying and harassment, the report said.
The Oversight Board also launched an implementation committee to evaluate whether the company actually makes the policy changes it has promised in response to the board's recommendations, the board said.
Tension between the Oversight Board and Facebook flared last fall when the board chastised Facebook for its lack of transparency about a program meant to exempt famous people from facing penalties over posts that break the company’s content rules. At the time, citing Facebook internal documents, the Wall Street Journal had reported that while the company told the board that the program affects only a “small number of decisions,” it actually included at least 5.8 million users in 2020. The board pounced, arguing that the company had not been “fully forthcoming.”