
23 April 2025, 15:18
Children engage with technology instinctively and naturally.
Whether it’s learning through tablets, chatting with friends in games, or exploring the endless rabbit holes of YouTube, young people’s lives are increasingly online. And while we rightly celebrate the opportunities this brings, we must also face up to the dangers that lurk in the shadows of the internet — and do so with urgency.
The Internet Watch Foundation (IWF) has today published its Annual Data and Insights Report, and the findings are nothing short of harrowing. In 2024 alone, it confirmed a record-breaking 291,273 reports of child sexual abuse imagery. To put that into perspective: every 108 seconds, IWF analysts confirmed a report of a child being sexually abused.
Even more disturbing is the explosive rise in AI-generated child sexual abuse material. Reports of this material to the IWF have surged by 380% since 2023, and a staggering 98% of the AI-generated images and videos actioned contained imagery of girls.
This is not just a tech safety crisis; it is a gendered violence crisis. If Labour is to deliver on our mission to halve violence against women and girls within a decade, we must treat tech-facilitated abuse with the same seriousness as abuse that happens offline. The internet is not a separate world – it is now one of the most prolific frontlines for this kind of exploitation, and it's evolving at pace.
Back in October, I first raised the alarm on AI-generated child sexual abuse material with the Home Secretary. The signs were already there. The exponential growth since has been both predictable and avoidable. AI developers must put child safety before profit. That includes embedding safety-by-design principles, rigorous model testing, and transparency over how these tools are being used — and abused.
We must do better for our children. I was proud to be appointed Chair of the All-Party Parliamentary Group on Children’s Online Safety last year, and I am determined that the UK should lead the global effort to make the internet a safer place for young people.
There is hope — but only if we act decisively. The Online Safety Act is a landmark piece of legislation with the potential to transform how we protect children online. For the first time, tech platforms are legally required to detect and remove child sexual abuse material. That is a vital step forward, and one that will save countless children from exploitation.
The regulator Ofcom must be bold, ambitious, and relentless in using the powers the Act provides. Tech giants that have spent billions refining algorithms to serve ads and curate content have the expertise, budgets, and global reach to solve these problems. We must ensure they are not only meeting the minimum requirements but are going beyond them, innovating to protect children as fiercely as they pursue profits.
As the IWF’s report makes painfully clear, these are not theoretical concerns. This abuse is happening to real children, right now.
We cannot, and must not, look away. Phones and screens are here to stay – we cannot put that genie back in the bottle. But we can and must build a digital world where children can play, learn, and grow without fear. That’s not just good policy. It’s a moral imperative.
________________
Gregor Poynton is the Labour MP for Livingston and Chair of the APPG on Children’s Online Safety
LBC Opinion provides a platform for diverse opinions on current affairs and matters of public interest.
The views expressed are those of the authors and do not necessarily reflect the official LBC position.
To contact us email views@lbc.co.uk