Paedophiles using AI to ‘de-age’ celebrities, redistribute and create child abuse images
25 October 2023, 14:28 | Updated: 25 October 2023, 15:47
Paedophiles are using AI to turn existing photos into new child sexual abuse material, with around 3,000 such images shared on a dark web forum.
Around 3,000 AI-generated images of child abuse have been shared on the dark web since September.
564 of the images depicted the most severe type of imagery, including sexual torture, rape and bestiality.
1,372 images showed children between the ages of seven and 10 years old.
In addition, 143 of the images depicted children aged three to six, and two showed infants under two years old, according to the Internet Watch Foundation (IWF).
Child sexual abuse material is illegal under the Protection of Children Act 1978, which criminalises the taking, distribution and possession of indecent photographs or pseudo-photographs of children.
Most of the images found on the dark web breached the Act.
Some images show the faces and bodies of real children, whose photographs have been used to train the AI models.
The charity said even trained analysts would find it difficult to distinguish between real and AI-generated images.
The IWF warned that the text-to-image technology will only become more sophisticated.
Dan Sexton, the Chief Technical Officer at the IWF, told LBC that even though the images are AI-generated, it is not a "completely victimless crime".
He said: "There are real victims being used here.
"Not only do they have to worry about their imagery being distributed on the internet and viewed and shared but now their imagery is being used as a tool to create new imagery of abuse.
"So the whole cycle is problematic in so many places."
In other cases, paedophiles are using AI to "nudify" images of clothed children and upload them online.
Criminals are also using image-generating software to "de-age" celebrities and present them as children in sexual scenarios.
Ian Critchley, the National Police Chiefs' Council lead for child protection in the UK, said the creation of these images normalises child abuse in the real world.
He said: "It is clear that this is no longer an emerging threat - it is here and now.
"We are seeing children groomed, we are seeing perpetrators make their own imagery to their own specifications, we are seeing the production of AI imagery for commercial gain - all of which normalises the rape and abuse of real children."
The UK’s upcoming Online Safety Bill is designed to hold social media platforms to account for the content published on their sites.
However, the Bill does not cover AI companies whose models are being used to create images showing abuse.
The UK government will host an AI safety summit on the first two days of November to address the risks AI poses and to consider what action needs to be taken.
A Home Office spokesperson said: "Online child sexual abuse is one of the key challenges of our age, and the rise in AI-generated child sexual abuse material is deeply concerning.
"We are working at pace with partners across the globe to tackle this issue, including the Internet Watch Foundation.
"Last month, the home secretary announced a joint commitment with the US government to work together to innovate and explore development of new solutions to fight the spread of this sickening imagery."