AI-generated fake nude images on social media are becoming a problem. Two cases involving fake images of celebrities are now under scrutiny: one shows a nude celebrity, the other depicts a celebrity in a pornographic scene. Users appealed to Meta's Oversight Board after the images, posted on Meta's Facebook and Instagram platforms, were removed.
The board said it selected these cases to assess whether Meta's policies and enforcement practices are effective in addressing explicit AI-generated images.
Meta removed both posts for violating its bullying and harassment policy, which prohibits "derogatory sexualized photoshops or drawings." Meta also said some of the content violated its adult nudity and sexual activity policy.
The Oversight Board decided to treat both cases together. For each, it will decide whether the content should be allowed on Instagram or Facebook.
The first case concerns an AI-generated image of a nude woman posted on Instagram. The image was created with AI in the likeness of an Indian celebrity. The account that posted the content shares only AI-generated images of Indian women.
"The majority of users who reacted had accounts in India, where deepfakes are increasingly becoming a problem," the board said.
In this case, a user had reported the content to Meta as pornographic. After the board selected the matter, Meta determined that its earlier decision to leave the content up was a mistake and removed the post.
The second case concerns an image posted to a Facebook group for AI creations. It is an AI-generated image of a nude woman being groped by a man. The image was created with AI to resemble an American celebrity, whose name appears in the caption. The majority of users who reacted have accounts in the United States.
"In this case, another user had already posted this image, so it was escalated to Meta's policy or subject matter experts, who decided to remove the content as a violation of the bullying and harassment policy, specifically the rule against 'derogatory sexualized photoshops or drawings,'" the Oversight Board explained.
The image was added to a Media Matching Service bank, part of Meta's automated enforcement system, which automatically finds and removes images that human reviewers have already identified as violating Meta's rules. In this case, therefore, the image had already been deemed to violate Facebook's Community Standards and was removed.
Moonshot News is an independent European news website for IT, media and advertising professionals. Run by women, it focuses on stories about diversity, inclusion and gender equality in the industry.
Our mission is to provide the best, unbiased information for all professionals and to give women a fair voice in the news and in the spotlight.
We produce original content, news articles, a curated calendar of industry events, and a database of women's associations in IT, media and advertising.