Maliciously edited Joe Biden video can stay on Facebook, Meta's Oversight Board says

However, the board said Meta should update its "incoherent" manipulated media policy.


The Oversight Board is urging Meta to update its manipulated media policy, calling the current rules “incoherent.” The admonishment comes in a closely watched decision about a misleadingly edited video of President Joe Biden.

The board ultimately sided with Meta's decision not to remove the clip at the center of the case. The video featured footage from October 2022, when the president accompanied his granddaughter, who was voting in person for the first time. News footage shows that after voting, he placed an “I voted” sticker on her shirt. A Facebook user later shared an edited version that looped the moment so it appeared as if he repeatedly touched her chest. The caption accompanying the clip called him a “sick pedophile” and said those who voted for him were “mentally unwell.”

In its decision, the Oversight Board said that the video did not violate Meta’s narrowly written manipulated media policy because it was not edited with AI tools, and because the edits were “obvious and therefore unlikely to mislead” most users. “Nevertheless, the Board is concerned about the Manipulated media policy in its current form, finding it to be incoherent, lacking in persuasive justification and inappropriately focused on how content has been created rather than on which specific harms it aims to prevent (for example, to electoral processes),” the board wrote. Meta should “reconsider this policy quickly, given the number of elections in 2024.”

The company’s current rules apply only to videos that are edited with AI, and don’t cover other types of editing that could be misleading. In its policy recommendations to Meta, the Oversight Board says the company should write new rules that cover both audio and video content. The policy should apply not just to misleading speech but also to “content showing people doing things they did not do,” and the board says these rules should apply “regardless of the method of creation.” Furthermore, the board recommends that Meta no longer remove posts with manipulated media if the content doesn’t break any other rules. Instead, the board suggests Meta “attach a label indicating the content is significantly altered and may mislead.”

The recommendations underscore mounting concern among researchers and civil society groups that the surge in AI tools could enable a new wave of viral election misinformation. In a statement, a Meta spokesperson said the company is “reviewing the Oversight Board’s guidance and will respond publicly” within the next 60 days. While that response would come well before the 2024 presidential election, it’s unclear when, or if, any policy changes will follow. The Oversight Board writes in its decision that Meta representatives indicated the company “plans to update the Manipulated Media policy to respond to the evolution of new and increasingly realistic AI.”