Fake robocalls. Doctored videos. Why Facebook is being urged to fix its election problem.

As the nation heads into the 2024 presidential election, the independent body that reviews Meta’s content moderation decisions is urging the tech giant to overhaul its policy on manipulated videos to encompass fake or distorted clips that can mislead voters and tamper with elections.

The test case was a doctored video of President Joe Biden that appeared on Facebook last May.

Meta bans video clips that have been digitally created or altered with generative artificial intelligence to make it appear as if people have said something they did not. But it doesn't address cruder clips – so-called “cheap fakes” – made with basic editing tools, nor does it address clips that show someone doing something they did not.

The Oversight Board upheld Meta's decision to allow the Biden video to remain on Facebook but called on Meta to crack down on all doctored content, regardless of how it was created or altered. It also recommended that Meta clearly define the aim of its policy to encompass election interference.

Of particular concern is faked audio, which the board said is “one of the most potent forms of electoral disinformation we’re seeing around the world.”

In January, an artificially generated robocall mimicking Biden's voice encouraged New Hampshire voters to skip the primary. The New Hampshire Attorney General's Office is investigating the call as an attempt at voter suppression. It did not affect the outcome of the primary – Biden won in a landslide – but critics say it illustrated how generative AI could be used to influence an election.

“As it stands, the policy makes little sense,” Oversight Board Co-Chair Michael McConnell said in a statement. “It bans altered videos that show people saying things they do not say but does not prohibit posts depicting an individual doing something they did not do. It only applies to video created through AI, but lets other fake content off the hook.”

Meta did not say whether it would follow the Oversight Board’s guidance. A spokesman said the company was reviewing the recommendations and would respond publicly within 60 days.

Even if Meta makes changes to its manipulated media policy, observers say there's no guarantee it will put enough money and resources into enforcing the changes.

“The volume of misleading content is rising, and the quality of tools to create it is rapidly increasing,” McConnell said. “Platforms must keep pace with these changes, especially in light of global elections during which certain actors seek to mislead the public.”

The White House urged companies to step up their efforts to combat manipulated media, a Biden administration official said Tuesday.