Hundreds of British female actors, TV stars, musicians, YouTubers and journalists have been victims of deepfake pornography, a Channel 4 News investigation to be broadcast tonight has found.
At least 250 British celebrities appear in the deepfake videos, which use artificial intelligence to superimpose their faces onto porn.
Channel 4 News is not publishing the names of those affected. The programme said it contacted more than 40 of the celebrities, all of whom were reluctant to comment publicly.
Channel 4 News presenter Cathy Newman was among the victims. Responding in her report to the deepfake video of herself, she said: “It feels really creepy that there are people who put this together and I can't see them, but they can see this imaginary version of me, this fake version of me.
“I can't stop seeing it. It's something I keep coming back to. And just the idea that thousands of women have been manipulated like this feels like a completely gross intrusion and violation.
“It's really alarming that something like this can be found at the click of a button and that people can so easily create this grotesque parody of reality.”
The volume of deepfake porn has skyrocketed, driven in part by advances in AI technology and by easily accessible apps available online.
In 2016, researchers identified just one deepfake porn video online. In the first three quarters of 2023 alone, 143,733 new deepfake porn videos were uploaded online, more than in all previous years combined. That means there are millions of victims around the world.
The videos have been watched widely: an independent analysis shared with Channel 4 News found a total of 4.2 billion views across the 40 most visited deepfake porn sites.
A Channel 4 News analysis of the most visited deepfake websites found that around 4,000 celebrities were on the list, 255 of whom were British.
More than 70% of visitors to the top five deepfake porn sites came from search engines like Google.
Earlier this year, explicit deepfake images of Taylor Swift were posted on X (formerly Twitter). They were viewed approximately 45 million times before the platform removed them.
Facebook and Instagram also reportedly ran ads showing blurred deepfake sexual images of the actress Jenna Ortega, created when she was just 16 years old; Meta later removed the images.
Campaigner Elena Michael from the NotYourPorn group told Channel 4 News: “Platforms are profiting from this kind of content. Not just porn companies, not just deepfake porn companies, but social media sites as well. It pushes traffic to the site. It makes advertising more effective.”
“There are so many different ways that users engaging with this content benefit a platform, even if they never buy anything or take any other action on the site. Social media sites that profit from it should not be tolerated.”
Despite the proliferation of deepfake videos targeting celebrities, the women most often targeted are private individuals. An independent investigation shared with Channel 4 News found that hundreds of thousands of images and videos of unknown people were posted across 10 websites in 2023.
Most of this imagery is created using apps, and the number of so-called “undressing” or “nude” apps has mushroomed to more than 200.
Channel 4 News spoke to Sophie Parrish, a 31-year-old mother of two from Merseyside. She discovered that deepfake nude images of her had been posted online. She described the impact they had on her life and her family:
“It's so violent, so degrading. It's like women don't mean anything, we're worthless, we're just pieces of meat, and men can do whatever they want. Before this I trusted everyone; my walls were always down. Now I don't trust anyone.
“One day my oldest son is going to ask, 'What did that mean man do to upset Mummy?' Because overnight he watched his mother change from a happy person into a complete shell of who she once was, crying almost every day and getting angry easily.”
Since 31 January this year, sharing deepfake images without consent has been illegal under the Online Safety Act, but creating the content is not. An individual commits an offence when they share deepfake porn without the consent of the person depicted.
Online safety regulation is in the hands of the broadcasting watchdog Ofcom, which is still consulting on how the new law will be implemented and enforced.
Activists and legal experts who spoke to Channel 4 News criticized the watchdog's draft guidelines as weak because they do not put pressure on major technology platforms that facilitate the hosting and dissemination of deepfake porn.
A spokesperson for Ofcom told Channel 4 News: “Illegal deepfake material is deeply disturbing and damaging. Under the Online Safety Act, companies must ensure that such content does not circulate on their services. They will need to assess the risk of it appearing, take steps to prevent it, and act quickly to remove it once they become aware of it. We encourage them to act now to protect their users.”
Google, Meta, and X declined to be interviewed. A Google spokesperson told Channel 4 News in a statement: “We understand how distressing this content is and are committed to strengthening existing protections to help those affected.
“Under our policies, people can have pages that feature this content and include their likeness removed from Search. While this is a technical challenge for search engines, we are actively developing additional safeguards, including tools to help people protect themselves at scale, along with ranking improvements to address this content more broadly.”
Ryan Daniels of Meta said: “Meta strictly prohibits child nudity, content that sexualises children, and services offering non-consensual AI-generated nude images. These ads and the accounts behind them have been removed.” The app concerned nevertheless remains widely available in various app stores.