Oversight board urges Meta to rethink its policy on manipulated media in high-stakes election year

FILE - People talk near a Meta sign outside of the company's headquarters in Menlo Park, Calif., March 7, 2023. Meta's Oversight Board said Monday, Feb. 5, 2024, that it is urging the company to clarify its approach to manipulated media so its platforms can better beat back the expected flood of online election disinformation this year. The recommendations come after the board reviewed an altered video of President Joe Biden that was misleading but didn't violate the company's policies. (AP Photo/Jeff Chiu)

NEW YORK (AP) — An oversight board is criticizing Facebook owner Meta's policies regarding manipulated media as "incoherent" and insufficient to address the flood of online disinformation that already has begun to target elections across the globe this year.

The quasi-independent board on Monday said its review of an altered video of President Joe Biden that spread on Facebook exposed gaps in the policy. The board said Meta should expand the policy to cover not only videos generated with artificial intelligence, but media regardless of how it was created. That includes fake audio recordings, which already have convincingly impersonated politicians in the U.S. and elsewhere.

The company also should clarify the harms it is trying to prevent and should label images, videos and audio clips as manipulated instead of removing the posts altogether, the Meta Oversight Board said.

The board's feedback reflects the intense scrutiny facing many tech companies over their handling of election falsehoods in a year when voters around the world will go to the polls. As both generative artificial intelligence deepfakes and lower-quality "cheap fakes" on social media threaten to mislead voters, the platforms are trying to catch up and respond to false posts while protecting users' rights to free speech.

"As it stands, the policy makes little sense," Oversight Board co-chair Michael McConnell said of Meta's policy in a statement on Monday. He said the company should close gaps in the policy while ensuring political speech is "unwaveringly protected."

Meta said it is reviewing the Oversight Board's guidance and will respond publicly to the recommendations within 60 days.

Spokesperson Corey Chambliss said while audio deepfakes aren't mentioned in the company's manipulated media policy, they are eligible to be fact-checked and will be labeled or down-ranked if fact-checkers rate them as false or altered. The company also takes action against any type of content if it violates Facebook's Community Standards, he said.

Facebook, which turned 20 this week, remains the most popular social media site for Americans to get their news. But other social media sites, among them Meta's Instagram, WhatsApp and Threads, as well as X, YouTube and TikTok, also are potential hubs where deceptive media can spread and fool voters.

Meta created its oversight board to serve as a referee for content on its platforms. The board's current recommendations come after it reviewed an altered video of President Biden and his adult granddaughter that was misleading but didn't violate the company's specific policies.

The original footage showed Biden placing an "I Voted" sticker high on his granddaughter's chest, at her instruction, then kissing her on the cheek. The version that appeared on Facebook was altered to remove the important context, making it seem as if he touched her inappropriately.

The board's ruling on Monday upheld Meta's 2023 decision to leave the seven-second clip up on Facebook, since it didn't violate the company's existing manipulated media policy. Meta's current policy says it will remove videos created using artificial intelligence tools that misrepresent someone's speech.

"Since the video in this post was not altered using AI and it shows President Biden doing something he did not do (not something he didn't say), it does not violate the existing policy," the ruling read.

The board advised the company to update the policy and label similar videos as manipulated in the future. It argued that to protect users' rights to freedom of expression, Meta should label content as manipulated rather than removing it from the platform if it doesn't violate any other policies.

The board also noted that some forms of manipulated media are made for humor, parody or satire and should be protected. Instead of focusing on how a distorted image, video or audio clip was created, the company's policy should focus on the harm manipulated posts can cause, such as disrupting the election process, the ruling said.

Meta said it welcomes the Oversight Board's ruling on the Biden post and will update the post after reviewing the board's recommendations.

Meta is required to heed the Oversight Board's rulings on specific content decisions, though it's under no obligation to follow the board's broader recommendations. Still, the board has gotten the company to make some changes over the years, including making its messages to users who violate its policies more specific about what they did wrong.

Jen Golbeck, a professor in the University of Maryland's College of Information Studies, said Meta is big enough to be a leader in labeling manipulated content, but follow-through is just as important as changing policy.

"Will they implement those changes and then enforce them in the face of political pressure from the people who want to do bad things? That's the real question," she said. "If they do make those changes and don't enforce them, it kind of further contributes to this destruction of trust that comes with misinformation."

___

Associated Press technology writer Barbara Ortutay in San Francisco contributed to this report.

___

The Associated Press receives support from several private foundations to enhance its explanatory coverage of elections and democracy. See more about AP's democracy initiative. The AP is solely responsible for all content.

