Online child sex abuse content ‘can be stopped without harming encryption’

21 July 2022, 07:04

Online abuse AI algorithm. Picture: PA

A new paper published by UK cybersecurity experts has proposed ways of tackling abuse material without breaking encryption.

UK cybersecurity experts have laid out a range of possible ways that child sexual abuse material could be detected within encrypted services that would still protect user privacy.

Technical experts from GCHQ and the National Cyber Security Centre (NCSC) have published a paper on the subject, which they say they hope will inform the debate around end-to-end encrypted platforms and child online safety.

The Government and child safety campaigners have previously warned that the increased use of encrypted messaging services, such as WhatsApp, makes it more difficult for law enforcement to detect online abuse – while privacy groups have argued that forcing platforms to create workarounds to encryption threatens everyone’s personal privacy and safety.

Last year, Apple announced and then subsequently delayed a tool that would scan photos a user attempted to upload to their iCloud library – as part of tackling child sexual abuse material – after backlash from some over potential privacy implications.

In their paper, NCSC technical director Dr Ian Levy and Crispin Robinson, GCHQ’s technical director for cryptanalysis, lay out several ways in which technology could be used to aid the detection of child sexual abuse material without breaking encryption.

The proposals include storing digital fingerprints of known abuse material on a user’s device, so the device itself can detect whether any known material is sent or received; using on-device artificial intelligence to scan message text for language that could indicate a link to child sexual abuse; and scanning images and video for known material.

In some of these cases, the authors suggest this data would never leave the person in question’s device but would instead be used to flag concerns to the user and prompt them to report it themselves, while others suggest varying forms of secure external analysis.
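The on-device fingerprint matching described above can be sketched very roughly as follows. This is a minimal illustration under assumptions, not the mechanism proposed in the paper: production systems use perceptual hashes (such as Microsoft’s PhotoDNA) that survive resizing and re-encoding, whereas a plain cryptographic hash is used here only to show the matching step, and all names and values are hypothetical.

```python
import hashlib

# Hypothetical set of fingerprints of known material, distributed to
# and stored on the user's device. (The value below is just the
# SHA-256 digest of the bytes b"foo", used as a placeholder.)
KNOWN_FINGERPRINTS = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}


def fingerprint(data: bytes) -> str:
    """Return a hex digest acting as the content's digital fingerprint."""
    return hashlib.sha256(data).hexdigest()


def matches_known_material(data: bytes) -> bool:
    """True if the content's fingerprint appears in the on-device set.

    In the paper's framing, a match need not leave the device: it could
    simply trigger a prompt asking the user to report the content.
    """
    return fingerprint(data) in KNOWN_FINGERPRINTS
```

Because the comparison runs entirely on the device against a locally held fingerprint set, no message content has to be disclosed to an external party for the check itself, which is the privacy-preserving property the authors emphasise.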

However, they acknowledge that many of their proposals currently have flaws and would require work from all engaged parties on the issue to make them feasible and technically sound.

The authors say the paper is not part of any Government policy or a set of rules or requirements they believe must be introduced, but instead a way of creating a more informed debate around the subject.

Dr Levy and Mr Robinson also acknowledge that further research and technical work are needed on the issue, saying there is “undoubtedly work to be done” to examine and understand the impact of any of their proposals.

“We hope this paper will help the debate around combating child sexual abuse on end-to-end encrypted services, for the first time setting out clearly the details and complexities of the problem,” the authors write in the paper.

“We hope to show that the dual dystopian futures of safe spaces for child abusers and insecurity by default for all are neither necessary nor inevitable.

“We have written this paper having spent many years combating child abuse, but also in the technical domains of cryptography and computer security.”

Child safety campaigners have praised the new paper.

Andy Burrows, head of child safety online policy at the NSPCC, said: “This important and highly credible intervention breaks through the false binary that children’s fundamental right to safety online can only be achieved at the expense of adult privacy.

“The report demonstrates it will be technically feasible to identify child abuse material and grooming in end-to-end encrypted products.

“It’s clear that barriers to child protection are not technical, but driven by tech companies that don’t want to develop a balanced settlement for their users.

“The Online Safety Bill is an opportunity to tackle child abuse taking place at an industrial scale.

“Despite the breathless suggestions that the Bill could ‘break’ encryption, it’s clear that legislation can incentivise companies to develop technical solutions and deliver safer and more private online services.”

By Press Association
