
Why online abuse continues to slip through the cracks

Securing the removal of harmful content is often slow, complicated, and, without legal support, near-impossible, writes Andrew Fremlin-Key


By Andrew Fremlin-Key

Whether Louis Theroux’s latest documentary exposed or inadvertently legitimised harmful ideologies remains contested, but the growing prevalence of the ‘manosphere’ in public discourse is clear. Against this backdrop, amendments to the Crime and Policing Bill, introduced by Baroness Bertin in the House of Lords, have drawn heightened interest from campaigners increasingly concerned about the normalisation of violence against women and girls in online content.


One of the more controversial amendments has reignited a familiar debate: whether pornography platforms should be subject to stricter age and consent verification checks. The Government is likely to resist, arguing that existing laws are sufficient. Meanwhile, proposals to require tech platforms to remove non-consensually shared intimate images within 48 hours signal a shift in responsibility toward companies best placed to police harmful content.

However, this shift is only going to work if it is backed by swift, decisive enforcement. From a practitioner’s perspective, the real weakness in the current regime lies not so much in the absence of laws, but in the difficulty of making them work in practice. I have represented victims of serious online abuse, and have seen firsthand how securing the removal of harmful content is often slow, complicated, and, without legal support, near-impossible.

The process is labyrinthine. Victims are typically directed to platform-specific reporting systems and policies that are inconsistent and opaque, and, in many cases, the companies are slow to respond (if they respond at all). There is rarely a single, standardised route for all complaints. Instead, users must navigate fragmented interfaces, unclear categories, and automated triage systems that can misclassify or deprioritise what are serious harms. This is not just administratively burdensome, but retraumatising.

Accountability is another major gap. Platforms typically act as both gatekeepers and judges, with minimal external scrutiny of their decisions. Takedown requests are frequently rejected with little explanation and weak appeals processes. While some companies have improved their systems under public pressure, reliance on goodwill is not enough.

Content can now be created without any direct involvement from the person depicted, making traditional legal tests – designed around identifiable acts of recording or distribution – harder to apply. In such cases, victims may struggle even to articulate why the content is unlawful under existing frameworks, let alone persuade a platform to remove it swiftly. The scale and speed at which fake content can be generated and republished exacerbate the problem. It rapidly becomes a game of whack-a-mole, which can feel unwinnable without significant resources.

There need to be clear, enforceable obligations on response times, transparency, and user recourse if new takedown laws are not to become largely symbolic. Rights without remedies offer no protection. Even if the Government is confident that the legal framework is already robust enough to address these problems, it cannot credibly claim it is working.

At the same time, the ‘manosphere’ is undoubtedly gaining ground. In the current climate, views once dismissed as fringe are being repackaged as mainstream, even aspirational, content, fuelling a lucrative pyramid for the influencers who sit at its peak. We as a society must find a way to redress the balance. Legislation matters, but enforcement is key, and even then it is not a panacea. Greater investment in public awareness, education, and training is vital. And, fundamentally, these issues are avoided because they feel uncomfortable or ‘un-British’ to discuss. That reluctance itself is outdated, and silence enables harm to flourish. Ultimately, this is more than a legal problem; it is a societal one. And it demands a collective response.

________________

Andrew Fremlin-Key is a partner in the litigation and arbitration team at Withers, specialising in media, reputation and information disputes.

LBC Opinion provides a platform for diverse opinions on current affairs and matters of public interest.

The views expressed are those of the authors and do not necessarily reflect the official LBC position.

To contact us email opinion@lbc.co.uk