Criminalising deepfakes 'is not enough', claims internet safety group amid calls for total ban on 'nudifying apps'

10 March 2025, 08:27 | Updated: 12 March 2025, 08:11

Online Safety Day is today, 10th March from 7am to midnight on LBC, available across the UK on Global Player on your smart speaker, iOS or Android device; on DAB digital radio and TV, at LBC.co.uk and in London on 97.3 FM.

An internet safety charity has called on the government to ban 'nudifying apps' that are used to harm women and girls. Picture: Getty

By Katy Ronkin

An internet safety group has called on the government to ban 'nudifying apps' that are used to harm women and girls.


Internet Matters, a not-for-profit promoting children's safety online, is calling on the government to ban image-generation tools used to create non-consensual explicit images of women and girls.

These 'nudifying apps' are AI image generators that manipulate photos, removing clothing to generate a hyper-realistic image of an individual without their consent.

Campaigners say the apps are commonly used to create child sexual abuse material.

Home Secretary Yvette Cooper announced in March that creating non-consensual images and downloading these tools will become criminal offences under the new Crime and Policing Bill.

Read more: 'Nudifying apps' are a growing threat to women and girls. Will the Online Safety Act be enough?

Read more: The ultimate guide to keeping your children safe online by LBC's tech guru Will Guyatt

Cute little girl using smartphone at home, looking at screen before sleep time. Picture: Alamy

Katie Freeman-Tayler, head of research and policy at Internet Matters, told LBC that while she welcomes Ms Cooper's announcement, the government should go further and ban the apps themselves.

She told LBC: "We still are calling for a complete ban on these tools existing. Our rationale for that is they create an environment where it is OK to harass and abuse women and girls. We know there are reports of the tools being used to facilitate child sexual abuse material.

"They are being used to kind of create an accepted culture of harassment and abuse of women. We don't think that there is any reason or need for anyone to be able to create a nude image of someone.

"The government should go one step further. We're supportive of all of the various pieces of legislation that they've announced but the fact that they exist will mean that these images will continue to be created, so why not make them illegal to begin with?"

Research from the group estimates that half a million children in the UK (13%) have experienced a nude deepfake online, and that deepfake sexual content has increased by 400% over the last year.

Creating deepfakes can also pose legal risks for children, who may not realise that generating these images can constitute creating child sexual abuse material, warned Katie Mehew, an associate at law firm Mishcon de Reya who works on similar cases.


Ms Freeman-Tayler also offered advice for teachers and parents who are concerned about the rise of these apps.

"Children are often new adopters of technology, so parents and teachers often feel on the back foot," she said.

"Sometimes a child might be talking about something that the teacher hasn't ever heard of, and then of course, that's a challenge for both teachers and parents."

Ms Freeman-Tayler suggests that parents and teachers can focus on broader conversations around intimate images and misogyny to begin a discussion, but ultimately parents know their children best.

"You know your child best, so if you feel that this conversation is needed or appropriate or if there is a reason for it, we provide support for that on our website as well."