More than 250 UK celebrities are victims of deepfake pornography, probe finds

21 March 2024, 20:54

Women of The Year Lunch and Awards 2019 – London. Picture: PA

Channel 4 news presenter Cathy Newman watched footage of her own image superimposed on to pornography and said it felt like a violation.

More than 250 British celebrities have been victims of deepfake pornography, according to an investigation by Channel 4 News.

Among them is the channel’s news presenter, Cathy Newman, who said she felt violated on watching digitally altered footage in which her face was superimposed on to pornography using artificial intelligence (AI).

The broadcaster, which aired its investigation on Thursday evening, said it analysed the five most visited deepfake websites and found that 255 of the almost 4,000 famous individuals listed were British, all but two of them women.

In her report, Newman watched the deepfake footage of herself and said: “It feels like a violation.

“It just feels really sinister that someone out there who’s put this together, I can’t see them, and they can see this kind of imaginary version of me, this fake version of me.

“You can’t unsee that. That’s something that I’ll keep returning to.

“And just the idea that thousands of women have been manipulated in this way, it feels like an absolutely gross intrusion and violation.

“It’s really disturbing that you can, at a click of a button, find this stuff, and people can make this grotesque parody of reality with absolute ease.”

Cathy Newman watched digitally altered videos of herself after Channel 4 News did an investigation into deepfake pornography (Ian West/PA)

Channel 4 News said it contacted more than 40 celebrities for the investigation, all of whom were unwilling to comment publicly.

The broadcaster also said it found that more than 70% of visitors arrived at deepfake websites using search engines like Google.

Advances in AI have made it easier to create digitally altered and fake images.

Industry experts have warned of the danger posed by AI-generated deepfakes and their potential to spread misinformation, particularly in a year that will see major elections in many countries, including the UK and the US.

Earlier this year, deepfake images of pop star Taylor Swift were posted to X, formerly Twitter, and the Elon Musk-owned platform blocked searches linked to the singer after fans lobbied it to take action.

There were deepfakes of Taylor Swift on X in January (Doug Peters/PA)

The Online Safety Act makes it a criminal offence to share, or threaten to share, a manufactured or deepfake intimate image or video of another person without their consent, but it does not criminalise the creation of such content.

In its investigation, Channel 4 News claimed the individuals most targeted by deepfake pornography are women who are not in the public eye.

Newman spoke to Sophie Parrish, who started a petition before the law was changed, after the person who created digitally altered pornography of her was detained by police but faced no further legal action.

She told the PA news agency in January that she was sent Facebook messages from an unknown user, which included a video of a man masturbating over her and using a shoe to pleasure himself.

“I felt very, I still do, dirty – that’s one of the only ways I can describe it – and I’m very ashamed of the fact that the images are out there,” she said.

Tory MP Caroline Nokes, who is chairwoman of the Women and Equalities Committee, told Channel 4 News: “It’s horrific… this is women being targeted.

“We need to be protecting people from this sort of deepfake imagery that can destroy lives.”

In a statement to the news channel, a Google spokesperson said: “We understand how distressing this content can be, and we’re committed to building on our existing protections to help people who are affected.

“Under our policies, people can have pages that feature this content and include their likeness removed from Search.

“And while this is a technical challenge for search engines, we’re actively developing additional safeguards on Google Search – including tools to help people protect themselves at scale, along with ranking improvements to address this content broadly.”

Ryan Daniels, from Meta, said in a statement to the broadcaster: “Meta strictly prohibits child nudity, content that sexualises children, and services offering AI-generated non-consensual nude images.”

Elena Michael, a campaigner from the group NotYourPorn, told Channel 4 News: “Platforms are profiting off this kind of content.

“And not just porn companies, not just deepfake porn companies, social media sites as well. It pushes traffic to their site. It boosts advertising.”

By Press Association
