
10 March 2025, 11:48 | Updated: 10 March 2025, 15:25
Online Safety Day is today, Monday 10th March from 7am to midnight on LBC, available across the UK on Global Player on your smart speaker, iOS or Android device; on DAB digital radio and TV, at LBC.co.uk and in London on 97.3 FM.
Half of the online child sexual abuse images recorded in the UK last year were distributed or taken on Snapchat.
Half of recorded online child sexual abuse image offences took place on Snapchat's private messaging service last year, amid calls for the platform to "take responsibility".
Private messaging services are allegedly "safe havens" for perpetrators, who can easily contact children on a service that deletes messages after 24 hours.
"We've been meeting tech platforms off the hook," Lewis Keller, Senior Policy Officer for Child Safety Online at the National Society for the Prevention of Cruelty to Children (NSPCC) said.
"We’re urging the tech platforms to take responsibility for this illegal content that’s being shared, which is totally unacceptable, and is endangering children."
This comes as the NSPCC wrote a letter to the Home Secretary expressing concern that the incoming Online Safety Act rules do not go far enough to protect children from the worst forms of abuse on private messaging services.
The child protection charity found that more than 100 child sexual abuse image crimes are recorded by police every day, according to Home Office data.
A separate Freedom of Information request submitted by the NSPCC showed that of the 7,338 offences where law enforcement recorded the platform used by the perpetrators, half took place on Snapchat, and a quarter on Meta products Instagram, Facebook and WhatsApp.
Keller said companies like Snapchat should start by removing perpetrators who add large numbers of accounts known to belong to children.
According to him, the platform should also remove features that make it easy to contact children, such as the 'Quick Add' function, which suggests accounts with mutual friends, and disappearing messages.
"Sharing in private messaging spaces are often safe havens for perpetrators.
"Private messaging services have been created without children’s safety in mind," Keller said.
The NSPCC campaigned for the Online Safety Act to be introduced, but they said the new rules have a loophole.
The senior policy officer explained that companies, like Snapchat, won't have to remove child sexual abuse material from private messaging platforms if it's not "technically feasible" to do so.
Keller acknowledged that the legislation is a "major step forward to ensuring children remain safe on the internet", but said it must be stricter to protect children.
He said that if this material has been shared and reported, and the platform becomes aware of it, the platform should be able to remove that content.
"We know that child sex abuse materials being shared and online grooming has been on the rise, and that’s why it’s so important that regulation keeps pace with changes in technology," he said.
He added that the new Online Safety Act was "essential to move the onus from users and children on to the tech platforms, who will now have to conduct risk assessments and change their platforms so children’s safety is taken into account".
Banning children from these apps would not be right, Keller argued.
"Children should have the opportunity to enjoy online spaces in a safe way," he said.
If children were banned from these platforms, they could join under an adult's account and would not be treated as child users with the appropriate protections in place.
Keller argued that the tightening of restrictions on public social media platforms will push children to private messaging platforms - and make them more vulnerable to this type of abuse.
It's of "vital importance" that the new Online Safety Act enforces restrictions on private messaging spaces.
"We’re calling on the government to set out how they plan keeping children safe in private messaging services," he said.
This comes as polls find two-thirds of people in the UK back jail time for tech bosses who fail to protect children online.
A Snapchat spokesperson said: "The sexual exploitation of any member of our community is illegal and horrific.
"If we are made aware of such content, whether through our proactive detection efforts or confidential in-app reporting, we remove it, lock the violating account, and report to authorities.
"Snapchat is designed to make it difficult for predators to find and interact with young people and has extra safeguards in place to help prevent strangers from connecting with teens. Our Family Centre also allows parents to see who their teens are friends with and talking to on Snapchat.
"We work with expert NGOs, and industry peers to jointly attack these problems and don’t believe the methodology used in this report reflects the seriousness of our collective commitment and efforts.”
The messaging app said it protects teens because accounts have no public profile photos and no public friend lists, and teens must have proactively added someone as a friend, or have them in their phone contacts, to receive a message from them.
It also warns teens if a user with whom they don't share a certain number of mutual friends tries to contact them.
Snapchat uses technology to detect child sexual abuse images, including Google's Content Safety API.
If this content is reported or detected, Snapchat said it removes it, locks the violating account and reports it to the authorities.