Huw Edwards received indecent images on WhatsApp: Tech companies must do more to keep platforms safe
16 September 2024, 14:00 | Updated: 16 September 2024, 14:43
Huw Edwards’ downfall is complete.
From trusted and beloved news host, a fixture in people’s lives who broke news of some of the most significant events in recent history, he is now a convicted criminal, guilty of possessing some of the most extreme child sexual abuse imagery.
The revulsion with which people now view him and his offences tells us a lot about how we regard a topic which is hard to confront: the sharing of imagery of children suffering sexual abuse, including Category A imagery which can depict rape, penetration, or even sexual torture. But at the heart of each and every one of those images and videos is a child.
These are real children who have suffered horrendous and degrading abuse, and who continue to suffer every time that material is shared – never knowing how many people have pored over that material online, never being able to fully move on from what was done to them.
And where do criminals go to share this material? This material which causes daily pain to victims, and which sends perpetrators on a dangerous downwards spiral which destroys reputations, families, careers, and lives?
It’s WhatsApp. An app on millions of phones and devices. A popular service which ordinary people trust every day.
People do not expect this platform to be a safe haven for criminals sharing this sort of abuse but, as it stands, there is nothing to stop that exact imagery being sent again on WhatsApp.
This is because it is an end-to-end encrypted messaging service, where even the company itself cannot see, let alone block, the criminal files being shared.
Though technology exists which can stop known child sexual abuse imagery being shared on these platforms, Meta chooses not to use it.
So now, even though we know about this imagery, the mechanisms are not in place to stop similar offences taking place in the future.
That’s what I’d like to see changed. This is a solvable problem and doing nothing here is a choice.
There are tried, trusted, and effective methods to detect images and videos of child sexual abuse and prevent them from being shared in the first place.
But in WhatsApp, these safeguards are effectively switched off, with no alternative measures in place.
This was a technology-enabled crime against children, and everyone, including big internet companies and platforms, owes it to those victims to make sure their imagery cannot spread even further. At the moment, Meta is choosing not to.
Edwards is, sadly, not unusual. The UK hosts less than one per cent of all known child sexual abuse imagery, yet demand for this criminal material here is high.
Finnish child protection group Suojellaan Lapsia has highlighted how predators and perpetrators actively prefer to use end-to-end encrypted apps to access and share content, and even to contact child victims, because they’re less likely to get caught.
It’s time for that to change, and time for WhatsApp, and other end-to-end encrypted services, to make it clear they will not accept this abuse of their platforms. It’s time to tell children we won’t allow safe places for their abuse to be shared around. That goal is well within our grasp.
________________
Dan Sexton is the Chief Technology Officer at the Internet Watch Foundation (IWF), an organisation working to stop the repeated victimisation of people abused in childhood and make the internet a safer place, by identifying and removing child sexual abuse imagery from the global internet.
LBC Views provides a platform for diverse opinions on current affairs and matters of public interest.
The views expressed are those of the authors and do not necessarily reflect the official LBC position.
To contact us email views@lbc.co.uk