Meta said the post did not violate its rules, which apply only to deepfakes — photos, videos and audio created by artificial intelligence to impersonate a person — that alter someone’s speech.
Meta’s Oversight Board, an independent collection of academics, experts and lawyers who oversee thorny content decisions on the platform, upheld the social media giant’s decision to leave the video in place. But it called on the company to clarify its policies, amid widespread concerns about the risks of artificial intelligence.
The Oversight Board is funded by Meta. Its decisions on specific cases are considered binding, but its recommendations on policy changes are not.
“The volume of misleading content is rising, and the quality of tools to create it is rapidly increasing,” Oversight Board Co-Chair Michael McConnell said in a statement. “Platforms must keep pace with these changes, especially in light of global elections during which certain actors seek to mislead the public.”
Meta spokesperson Corey Chambliss said the company was reviewing the guidance.
The rebuke comes as experts warn that AI-generated misinformation is already spreading online, potentially confusing scores of voters during a pivotal election year.
The video, which was posted on Facebook in May 2023, uses real footage of Biden when he voted in the 2022 midterm election along with his granddaughter, then a first-time voter, according to the Oversight Board.
The video “loops” a moment when Biden placed an “I Voted” sticker on his adult granddaughter’s chest, with the poster suggesting in a caption that the contact was inappropriate.
Because the video doesn’t alter Biden’s speech, the Oversight Board agreed it didn’t violate Meta’s rules. The board also said it was obvious the video had been edited.
But the video raises issues with Meta’s existing policies, which the Oversight Board said were focused on how content is created, rather than its potential harms — including voter suppression. It called on Meta to extend its manipulated media policy to address altered audio as well as videos that show people doing things they didn’t do.
The Oversight Board also recommended that when manipulated media doesn’t violate any other rules, the company should not remove it but instead attach a label alerting users that the content has been altered.