
27 April 2025, 21:06 | Updated: 27 April 2025, 21:23
As the Member of Parliament for Rotherham and a longstanding Parliamentary Champion of the Internet Watch Foundation (IWF), I have dedicated much of my career to the fight for stronger child protection measures.
One of the most pressing challenges I've been tackling for over a decade is the rise of child sexual abuse material (CSAM) online. Despite significant progress, the scale of this problem continues to grow.
In 2024, the IWF, the frontline against child sexual abuse in the UK, confirmed a record-breaking 291,273 reports of child sexual abuse imagery – an increase of 6% compared to 2023. This means every 108 seconds, a new report emerges showing a child being sexually abused. Each of these reports could involve hundreds, even thousands, of individual images or videos of abuse.
Troublingly, 97% of the reports in which victims’ sex was recorded showed the sexual abuse of girls only. This stark figure highlights the urgent need for increased focus on protecting young girls, who continue to be disproportionately affected by online abuse.
The rise of Artificial Intelligence (AI) has also made the creation of child sexual abuse material more widespread, allowing offenders to generate hundreds of illegal images with just a few clicks. In 2024, the IWF saw a staggering 380% surge in reports involving AI-generated child sexual abuse material.
This technology is causing real-world harm. Survivors of abuse are being re-victimised, as images of their trauma are repurposed, manipulated, or even used to train AI models. Photographic images of children are being digitally altered to become more extreme. In 2024, almost 40% of the AI-generated material reported to the IWF was rated as Category A – the most severe level.
In the last year, the Government has taken important strides to address this issue, including recent provisions introduced in the Crime and Policing Bill that will criminalise AI child sexual abuse image generators and guides. These new measures are a welcome response, and I’m pleased that action has been taken in line with the recommendations I made to the Minister last year. However, while this is a vital step forward, it is not enough. AI tools are constantly evolving, and many harmful platforms still fall outside the scope of existing regulations, including the Online Safety Act.
To truly protect children, we need stronger safeguards in place before these AI models even reach the market. Risk mitigation strategies must be built into these technologies from the start, and companies should be required to perform thorough assessments to ensure their products cannot be used to generate illegal content. The forthcoming AI Bill is the ideal vehicle for implementing these necessary protections to safeguard our children.
Additionally, the increasing availability and use of nudification apps is a growing concern, as they contribute to a rise in peer-to-peer abuse, with children sharing AI-generated sexual imagery of their peers and teachers. There is an urgent need to raise awareness and ensure that children and young people understand the potential harm of AI technologies. Schools ought to communicate the laws governing the use of nudification tools, so that children understand the consequences of creating or distributing abusive AI images, and signpost services that young people can turn to if they need support.
A key tool in tackling online abuse is Report Remove, a joint initiative between the IWF and NSPCC, which gives children a secure and anonymous way to report sexual images or videos of themselves and have them removed from the internet. The sharing of nude images can have a devastating impact on children, and this service provides young people with a way to take control of their online presence and safety.
In 2024, Report Remove received over 1,100 reports, a 44% increase from 2023. In total, over 7,600 images were flagged as illegal child sexual abuse material, with 64% of the material involving boys. This type of service needs to be made available to every child in the world to tackle an issue of global proportions.
I remain firmly committed to making the UK the safest place in the world to be a child online. Achieving this goal requires a united effort from policymakers, tech companies, law enforcement, and society as a whole. We must do more to protect all children online – particularly girls, who are disproportionately targeted and harmed.
Tackling the spread of child sexual abuse material is central to delivering the government’s pledge to halve violence against women and girls within a decade, and it is essential that we remain at the forefront of efforts to safeguard children from these rapidly evolving threats.
On 28th April 2025, I will be hosting the IWF in Parliament, where we will hold a drop-in session to mark the launch of their 2024 Annual Data and Insights Report. It’s an opportunity to understand the severity of this issue and renew our determination to keep children safe online. Together, we can build a future where every child can explore the digital world without fear of exploitation or abuse.
________________
Sarah Champion is the Labour MP for Rotherham and Chair of the International Development Select Committee.
LBC Opinion provides a platform for diverse opinions on current affairs and matters of public interest.
The views expressed are those of the authors and do not necessarily reflect the official LBC position.
To contact us email views@lbc.co.uk