Online child sex abuse content ‘can be stopped without harming encryption’

21 July 2022, 07:04

Online abuse AI algorithm. Picture: PA

A new paper from UK cybersecurity experts proposes ways of tackling abuse material without breaking encryption.

UK cybersecurity experts have laid out a range of possible ways that child sexual abuse material could be detected within encrypted services that would still protect user privacy.

Technical experts from GCHQ and the National Cyber Security Centre (NCSC) have published a paper on the subject, which they hope will inform the debate around end-to-end encrypted platforms and child online safety.

The Government and child safety campaigners have previously warned that the increased use of encrypted messaging services, such as WhatsApp, makes it more difficult for law enforcement to detect online abuse – while privacy groups have argued that forcing platforms to create workarounds to encryption threatens everyone’s personal privacy and safety.

Last year, Apple announced and then subsequently delayed a tool that would scan photos a user attempted to upload to their iCloud library – as part of tackling child sexual abuse material – after backlash from some over potential privacy implications.

In their paper, NCSC technical director Dr Ian Levy and Crispin Robinson, GCHQ’s technical director for cryptanalysis, lay out several ways in which technology could be used to aid the detection of child sexual abuse material without breaking encryption.

The proposals include storing digital fingerprints of known abuse material on a user’s device so the device itself can detect if any such material is sent or received; using on-device artificial intelligence to scan message text for language that could indicate a link to child sexual abuse; and scanning images and video for known material.

In some of these cases, the authors suggest the data would never leave the user’s device but would instead be used to flag concerns to the user and prompt them to report the material themselves; other proposals involve varying forms of secure external analysis.
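The on-device fingerprint matching described above can be illustrated with a minimal sketch. This is not the authors’ implementation: real systems use perceptual hashes (such as PhotoDNA) that tolerate re-encoding and cropping, whereas this illustration uses an exact cryptographic hash purely for brevity, and the function names and sample data are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest acting as the file's digital fingerprint.
    (Illustrative only: deployed systems use perceptual hashing,
    not an exact cryptographic hash.)"""
    return hashlib.sha256(data).hexdigest()

def check_outgoing(data: bytes, known_fingerprints: set[str]) -> bool:
    """True if the content matches a known fingerprint.
    In the paper's on-device proposals, this result would never leave
    the device -- it would only trigger a local prompt to the user."""
    return fingerprint(data) in known_fingerprints

# Hypothetical locally stored list of known-material fingerprints
known = {fingerprint(b"example-known-image-bytes")}

print(check_outgoing(b"example-known-image-bytes", known))  # True
print(check_outgoing(b"some-unrelated-photo", known))       # False
```

The design point the sketch captures is that matching happens entirely on the device against a locally stored fingerprint list, so no message content needs to be sent to a server for scanning.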

However, they acknowledge that many of their proposals currently have flaws and would require work from all engaged parties on the issue to make them feasible and technically sound.

The authors say the paper is not part of any Government policy or a set of rules or requirements they believe must be introduced, but instead a way of creating a more informed debate around the subject.

Dr Levy and Mr Robinson also acknowledge that further research and technical work were needed on the issue, saying there was “undoubtedly work to be done” to examine and understand the impact of any of their proposals.

“We hope this paper will help the debate around combating child sexual abuse on end-to-end encrypted services, for the first time setting out clearly the details and complexities of the problem,” the authors write in the paper.

“We hope to show that the dual dystopian futures of safe spaces for child abusers and insecurity by default for all are neither necessary nor inevitable.

“We have written this paper having spent many years combating child abuse, but also in the technical domains of cryptography and computer security.”

Child safety campaigners have praised the new paper.

Andy Burrows, head of child safety online policy at the NSPCC, said: “This important and highly credible intervention breaks through the false binary that children’s fundamental right to safety online can only be achieved at the expense of adult privacy.

“The report demonstrates it will be technically feasible to identify child abuse material and grooming in end-to-end encrypted products.

“It’s clear that barriers to child protection are not technical, but driven by tech companies that don’t want to develop a balanced settlement for their users.

“The Online Safety Bill is an opportunity to tackle child abuse taking place at an industrial scale.

“Despite the breathless suggestions that the Bill could ‘break’ encryption, it’s clear that legislation can incentivise companies to develop technical solutions and deliver safer and more private online services.”

By Press Association
