
James O'Brien 10am - 1pm
29 May 2025, 10:55 | Updated: 29 May 2025, 11:18
A 'face-swapping' app which encouraged users to put children's faces onto highly sexualised adult content has been pulled from Apple's App Store after LBC exposed its "dangerous" promotions.
The app, which LBC has decided not to name but which has been downloaded more than a million times, has been removed entirely from Apple's App Store. It remains available on Google's Play Store, but Google has launched an investigation.
Meanwhile, adverts for the app have been removed from TikTok for violating advertising policies.
The adverts which led to the app being pulled have been branded "horrifying and dangerous".
The clips - which experts are warning show how "offenders end up with the tech in their pocket to create dangerous threats" - were discovered in an LBC investigation, prompting Google to launch their own formal inquiries and Government calls for Ofcom intervention.
Viewed by tens of thousands of users since being pushed out as paid ads on TikTok this month, they promote an application which lets users project faces on to different bodies with what it describes as "stunningly realistic" Artificial Intelligence software.
We've taken the decision not to name it due to its advertised abilities, but statistics show it has already been downloaded more than a million times.
One advert seen by LBC shows a woman in yoga pants and a sports bra gyrating on a stool while filming herself in a mirror, with a circular insert of what appears to be a young child on the screen. It then shows the faces being swapped and the video continuing with the same actions.
Another shows a second woman, in cycling shorts and a sports bra, bouncing on an exercise ball in front of a mirror, again with a circular insert of what appears to be a young child on the screen. It then likewise shows the faces being swapped and the video continuing with the same actions.
Both clips finish with a title appearing on the screen urging viewers to download the app. Attempting to press the profile icon on both clips confirms they are being run as paid-for promotions, with the message 'Video ads do not support this feature' appearing on screen.
Digital minister Sir Chris Bryant told LBC he'd urge the regulator, Ofcom, to look into it.
Megan Hinton from the Marie Collins Foundation - which supports victims and survivors of technology-assisted child sexual abuse - condemned the ads and questioned TikTok's platforming of them: "We're dedicated to supporting children and families who have experienced this type of harm.
"And I can absolutely say that the impact is often lifelong and it's devastating. It can have severe impacts on mental health, on self-esteem, and it really is just an extreme violation of self to have your face or parts of your body manipulated in this way, especially on sexualised content.
"The fact that this is actually being advertised and pushed to users is particularly horrifying. I think we also stray into a bit of a danger area here where if you're encouraging people to put children's faces on sexualised content of adults, we're actually then looking at encouraging offenders to create pseudo images of child sexual abuse material.
"I'm horrified that technology companies think it is okay and allow for apps like this to be advertised on their social media platforms (in this way). They need to start prioritising safety over profit and take these adverts down."
Hannah Swirsky from the Internet Watch Foundation - which works to find and remove child sexual abuse material from the internet - added: "Technology is developing so quickly and it is really important for companies to think about how criminals could abuse new tech before they make it available.
“We see synthetic child sexual abuse imagery, sometimes involving real victims and survivors, which can be indistinguishable from photographic child sexual abuse imagery.
"That offenders can have the tech to do this available on an app on the phone in their pocket creates a dangerous threat. We know offenders are disproportionately targeting girls, and that this sort of abuse has a real and lasting effect on victims."
Both charities have also reported they're coming across a rising number of cases like this, echoing the findings of another LBC investigation which revealed there's been a four-fold increase in the number of AI-generated child sexual abuse images discovered online over the last twelve months.
We also covered last week's Culture, Media and Sport Committee session where Ofcom's Chief Executive Dame Melanie Dawes revealed the regulator had launched four 'enforcement investigations' into online platforms including a 'nudifying app' with similar abilities to this app.
And reacting to today's report, Minister for Creative Industries, Arts and Tourism, Sir Chris Bryant, called for the regulator to look into it too, telling LBC: "I've not seen it, and I'm always slightly cautious about commenting on things that I haven't seen, but from the way you've described it, it sounds absolutely appalling.
"And I'm absolutely certain that if it is as you described, that Ofcom would want to look at it.
"We've introduced new rules which mean that all apps and all platforms have to make sure that all the systems that they have within their apps and their platforms are able to protect young people.
"Of course, there's obviously also legislation around criminal offences, which some of this may touch on, and yes, I would urge Ofcom to look at this."
In response, an Ofcom spokesperson suggested this may fall outside of their investigatory remit but said: "Platforms regulated under the Act that fail to introduce appropriate measures to protect UK users – particularly children – should expect to face enforcement action."
Following our findings, a spokesperson for TikTok told LBC they have now removed the highlighted ads from their site for violating advertising policies.
A spokesperson for Google also confirmed it was investigating this matter, telling LBC it does not allow apps that engage in or benefit from promotional practices that are harmful, including through the use of sexually explicit ads to direct users to the app’s Google Play listing for download.
Tracking down who formally owns the app proves difficult, leading to a series of broken websites. But information buried in privacy policies and terms-and-conditions documents suggests it is linked to a company headquartered in China, which did not officially respond to LBC's request for comment.
However, we did obtain a response from a generic 'support' email listed for the app on both app stores, which suggested that either its murky structure or the potential use of AI in creating the ads themselves means at least one party involved isn't even aware the ads have been running.
The response said: "Thank you very much for bringing this matter to our attention.
"We take these concerns extremely seriously and have a zero-tolerance policy toward any form of content that could be interpreted as sexualising children or enabling technology-assisted abuse. We are currently reviewing the specific ads you referenced and have initiated internal checks based on the details you have provided.
"We are also working closely with relevant team members to ensure that any such inappropriate content is identified and removed immediately. We deeply appreciate your diligence in highlighting this issue and guiding us to ensure that our platform and promotions remain responsible and safe for all audiences.
"Should you have any further questions or evidence to share, we are more than willing to cooperate fully. Thank you again for your outreach and for helping hold technology providers to high ethical standards."
Since being made aware of the adverts by LBC, Apple has also removed the app in question from its App Store.
A Government spokesperson said: "Using social media for these disturbing purposes is vile. This government is taking robust action to ensure platforms take action to protect users from illegal material, as well as introducing new offences to crack down on the creation and distribution of child abuse images online.
"Like parents across the country, we expect to see these laws help create a safer online world.
"But we won't hesitate to go further to protect our children; these laws are the foundation, not the limit, when it comes to children's safety online."