Extremist content is everywhere - the Online Safety Act won’t stop terrorism unless it targets smaller platforms

24 January 2025, 12:14


By Stuart Macdonald and Sean McCafferty

For the past year, the VOX-Pol Institute has been monitoring nearly 100 terrorist channels online. What we have seen users share in these spaces is alarming.


Videos glorifying and inciting violence. Livestream footage from attacks. Graphic photosets. Magazines and manifestos offering justifications for acts of extreme violence. This content spans a variety of ideologies – including but not limited to violent jihadism, the extreme right wing and incels (involuntary celibates) – and is consumed and shared by a growing number of individuals who draw on terrorist propaganda from across the ideological spectrum.

Perhaps most concerning of all is the availability of instructional guides for those wanting to build explosives, make poisonous substances, create their own firearms using 3D printers, and evade detection. Much of the propaganda also calls on supporters to plan attacks using whatever they have available, such as knives or vehicles.

For those who are tech-savvy – or even just determined – accessing extremist material is alarmingly easy. During 2024, we collected almost 50,000 posts that shared terrorist content. That's roughly 130 posts every day. And this only includes material that clearly falls within official definitions of terrorist content. It excludes vast amounts of extremist content that is more indirect and amorphous, yet still potentially harmful.

While the biggest platforms tend to receive the most scrutiny, the problem is a much wider one. We found terrorist content on more than 250 platforms – not just the social media giants. Many were file-sharing sites, fringe social media platforms and terrorist-operated websites. We have also seen terrorist groups experimenting with other service types, such as podcast sites and encrypted messaging platforms, and even using generative AI to create extremist imagery.

Many platforms actively try to disrupt the flow of terrorist content. But others are haphazard in their approach, or take no enforcement action at all. In some instances, this may be due to a lack of capacity. Others try to justify lax moderation standards by appealing to values such as user privacy or freedom of speech.

Still others explicitly oppose content moderation, even welcoming extremist content. The result is that users have stable access to content that glorifies and incites violence, and provides explicit guidance on how to plan and carry out attacks.

So regulators have an important role to play. In the UK, Ofcom is tasked with implementing the Online Safety Act. It is still early days, but it is clear from our work that focusing on the tech giants alone will not be enough. Doing so would simply allow terrorist content to continue to thrive on lesser-known platforms – and may even drive extremists into darker corners of the online ecosystem that are more difficult to monitor and disrupt.

But implementing an effective response is easier said than done. Issuing threats of enforcement action against small companies with few resources will do little to help them comply – especially when they are also trying to deal with many other types of online harm, such as child sexual abuse material.

There are also jurisdictional challenges. Different countries apply different standards, and we have watched terrorist groups experience disruption on one site, then migrate to a platform based in another country – often Russia. This type of fragmentation looks set to increase following Elon Musk's shake-up of moderation at X and Mark Zuckerberg's recent announcement that Meta will scale back content moderation on Facebook.

The size of the challenge is clear. But the past year has also shown us the importance of the task. Content that glorifies, encourages, justifies and enables extreme violence should not be readily discoverable by those seeking it out. We need an effective, industry-wide response that combines education and capacity-building with strong enforcement action where necessary.

________________

Stuart Macdonald is Co-Director of Swansea University's Cyber Threats Research Centre (CYTREC) and Coordinator of the VOX-Pol Network.

Sean McCafferty is a member of the Cyber Threats Research Centre (CYTREC) at Swansea University.

LBC Views provides a platform for diverse opinions on current affairs and matters of public interest.

The views expressed are those of the authors and do not necessarily reflect the official LBC position.

To contact us email views@lbc.co.uk