The manipulated video was posted in September last year. [File] | Photo Credit: Reuters
The Meta Oversight Board has overturned the social media company’s decision to leave up an AI-manipulated video post showing Brazilian soccer player Ronaldo Nazário promoting an online game, an endorsement he never made.
The manipulated video, posted in September last year, falsely showed Nazário endorsing the play-to-earn game. Users who clicked the link in the AI-manipulated video were, however, taken to a different game entirely.
A Meta user reported the post, but the report went unaddressed and the content remained visible until the case reached the oversight board.
While the “ad” was disabled for violating the company’s Unacceptable Business Practices Advertising Standard, the original organic post stayed up. Only later was it removed for policy violations.
“Meta has a responsibility to ‘mitigate adverse human rights impacts’ of monetized content that could scam or defraud – in line with the United Nations Guiding Principles on Business and Human Rights. When paid to boost content, Meta should ensure these posts do not violate its policies,” noted the board in its decision.
The oversight board criticised Meta for lacking a uniform enforcement policy on celebrity endorsements. It also stressed Meta’s responsibility to users and the public to label or act against AI deepfakes in order to prevent fraud.
“Based on public reporting, the Board notes Meta is likely allowing significant amounts of scam content on its platforms to avoid potentially overenforcing a small subset of genuine celebrity endorsements. At-scale reviewers are not empowered to enforce this prohibition on content that establishes a fake persona or pretends to be a famous person in order to scam or defraud. Meta should enforce this prohibition at-scale by providing reviewers with often easily identifiable indicators that distinguish AI content,” noted the Meta Oversight Board in its post.
Published – June 06, 2025 02:27 pm IST