- Are you one of the 250 people affected? Email rory.tingle@mailonline.co.uk
More than 250 British celebrities, including Channel 4 newsreader Cathy Newman, are among the victims of deepfake porn, an investigation has found.
The investigation, carried out by Channel 4 itself, found digitally altered images in which celebrities' faces had been superimposed onto pornographic videos using artificial intelligence (AI).
The broadcaster analysed the five most-visited deepfake websites and found that of the nearly 4,000 celebrities featured, 255 were British, and all but two were women.
In the report, the 49-year-old presenter said she felt violated after watching the footage, but the investigation also found that most victims of this type of crime are not celebrities.
This comes months after a deepfake image of pop star Taylor Swift was posted on X (formerly Twitter), with experts warning of the dangers the technology poses in spreading misinformation.
“This feels like a violation,” Newman said in a report aired Thursday after seeing the deepfake footage of herself.
“It feels really sinister that there's someone out there who put this together, and I can't see them, but they can see this imaginary version of me, this fake version of me.
“I can't stop seeing it. It's something I keep coming back to.
“And just the idea that thousands of women have been manipulated in this way feels like an absolutely terrible intrusion and violation.
“The fact that something like this can be found at the click of a button and that people can so easily create this grotesque parody of reality is truly alarming.”
Channel 4 News said it contacted more than 40 celebrities for its investigation, but all were reluctant to comment publicly.
The broadcaster also said it found that more than 70 per cent of visitors arrived at deepfake websites using search engines such as Google.
Advances in AI have made it easier to create digitally altered fake images.
Industry experts have warned of the dangers posed by AI-generated deepfakes and their potential to spread misinformation, especially in a year when many countries, including the UK and US, will hold large-scale elections.
Earlier this year, a deepfake image of pop star Taylor Swift was posted on X (formerly Twitter), and the platform blocked searches related to the singer after fans urged the Elon Musk-owned site to take action.
The Online Safety Act makes it a crime to share, or threaten to share, fabricated or deepfake intimate images or videos of another person without their consent, but it does not make the creation of such content an offence.
Channel 4 News said its research found that women who are not in the public eye are the most frequently targeted by deepfake porn.
Ms Newman spoke to Sophie Parrish, who started a petition before the law was changed, after the person who created digitally altered pornographic images of her was detained by police but faced no further legal action.
She told the PA news agency in January that she had received a Facebook message from an unknown user containing videos of a man masturbating over images of her and using her shoes to pleasure himself.
“I felt, and still feel, very dirty. That's one of the only ways I can describe it. And I'm very embarrassed by the fact that the images are there,” she said.
Conservative MP Caroline Nokes, chair of the Women and Equalities Committee, told Channel 4 News: “It's horrifying… it is women who are being targeted.”
“We need to protect people from these kinds of deepfake images that can destroy lives.”
In a statement to the news channel, a Google spokesperson said: “We understand how distressing this content is and we are committed to strengthening existing protections to help those affected.
“Under our policies, people can have pages that feature this content and include their likeness removed from search.
“And while this is a technical challenge for search engines, we are actively developing additional safeguards on Google Search, including tools to help people protect themselves at scale and ranking improvements to address this content more broadly.”
“Meta strictly prohibits child nudity, content that sexualises children, and services offering non-consensual AI-generated nude images,” Meta's Ryan Daniels said in a statement to the broadcaster.
Activist Elena Michael, from the NotYourPorn group, told Channel 4 News: “Platforms profit from this kind of content.”
“Not just porn companies, not just deepfake porn companies, but social media sites as well. They push traffic to these sites, and that makes advertising more effective.”