Paedophiles using AI to make new images of real victims of child sexual abuse that threaten to 'overwhelm the internet'

25 October 2023, 09:13

Paedophiles are using AI to create new images of real abuse-victim children. Picture: Alamy

By Kit Heren

Paedophiles are using artificial intelligence to create new images of children who have previously been sexually abused, an internet watchdog has claimed.

The criminals are using the faces and bodies of real abuse victims to make new scenarios "because someone, somewhere, wants to see it," the Internet Watch Foundation (IWF) said.

The organisation said it feared that a deluge of AI-generated abuse images could overwhelm police and make it harder to safeguard children. "Our worst nightmares have come true," it added.

Researchers said they found nearly 3,000 AI-generated abuse images that would be illegal under UK law in a single forum. They discovered over 500 images of a girl who was sexually abused between the ages of 9 and 10, along with an AI model that would allow criminals to generate more.

Meanwhile, paedophiles are also using AI to create images of celebrities as children, the IWF said, and manipulating genuine photos of child actors to make them look like sexual abuse.

Read more: Convicted paedophile given custody of young girl and gets her pregnant - as authorities believe 'he poses low risk to kids'

Read more: Paedophile who abducted schoolgirl while dressed as a woman jailed for 20 years

The IWF released the information in a report ahead of the government's AI summit next week, in the hope of getting the subject onto the agenda.

Last month, Home Secretary Suella Braverman said the British and US governments had committed to "exploring further joint action to tackle the alarming rise in despicable AI-generated images of children being sexually exploited by paedophiles."

Susie Hargreaves, chief executive of the IWF, said: "Our worst nightmares have come true. Earlier this year, we warned AI imagery could soon become indistinguishable from real pictures of children suffering sexual abuse, and that we could start to see this imagery proliferating in much greater numbers.

"We have now passed that point.

"Chillingly, we are seeing criminals deliberately training their AI on real victims' images who have already suffered abuse.

"Children who have been raped in the past are now being incorporated into new scenarios because someone, somewhere, wants to see it.

"As if it is not enough for victims to know their abuse may be being shared in some dark corner of the internet, now they risk being confronted with new images, of themselves being abused in new and horrendous ways not previously imagined.

"This is not a hypothetical situation. We're seeing this happening now. We're seeing the numbers rise, and we have seen the sophistication and realism of this imagery reach new levels.

"International collaboration is vital. It is an urgent problem which needs action now. If we don't get a grip on this threat, this material threatens to overwhelm the internet."