
I've seen how social media algorithms push extremist content - it's easier to find than you think

Social media is filled with horrifying content as algorithms prioritise shock and rage over quality, writes Rebecca Henrys


I didn't quite believe that I would be able to access Islamic State content on social media so easily and certainly not within 30 seconds. Picture: LBC

By Rebecca Henrys

I still remember the first time I was shown a low-quality video of a person being beheaded with a blunt knife.


I was in a chemistry classroom at my secondary school, sitting on a lab chair close to a wall, when my friend pulled up the video and showed it to me under the table.

It is a moment that has been burned into my brain for how graphic and disturbing it was. Even now, almost 14 years later, I can still remember it in great detail.

Violent and extremist content used to live on forums and on dedicated websites, or in my case, shared between people using Bluetooth on mobile phones.

Now, it can be found within 30 seconds of logging onto social media sites used by millions.

And once you start watching, the algorithm will just keep pushing more of it to you.

I didn't quite believe that I would be able to access Islamic State content on social media so easily and certainly not within 30 seconds.

But my interest was piqued when I heard Adam Hadley, founder of Tech Against Terrorism, tell a Home Affairs select committee that this is how straightforward it is.

So, I started digging.

I created accounts on two popular social media sites, posing as a teenage boy with access to content rated 13+.

Lo and behold, in less than a minute, I was seeing Islamic State propaganda.

Beheadings, suicide bombings, and calls to arms were right there for a child to see, alongside general propaganda celebrating the group, which at the height of its power controlled territory spanning almost half the size of the UK.

People sharing their support while showing their faces, executions recreated in Roblox, and videos of Islamic State leaders and sermons from clerics sat boldly on the platform. Footage of the terror organisation's infamous black flag flying in the wind, and scenes glorifying men who had died in suicide bombings, were rife.

It was also very clear how they were getting around any existing moderation.

Once I started viewing this content, the algorithms would continue to push it onto my feeds and suggest accounts to follow that shared terror propaganda.

I'm passionate about investigating this because I have seen the dark side of the internet that is hiding in plain sight, whether it's the misuse of social media by extremists or how AI can be used to make chemical weapons.

Social media used to be a place where you could chat with your friends, share memes, and connect with people over shared interests.

Now, it's filled with horrifying content as algorithms prioritise shock and rage over quality.

Big tech companies need to work with researchers who understand extremist content and know the techniques used by terror organisations to evade bans, so that children and vulnerable people aren't exposed to these groups.

____________________

Rebecca Henrys is an online journalist for LBC.

LBC Opinion provides a platform for diverse opinions on current affairs and matters of public interest.

The views expressed are those of the authors and do not necessarily reflect the official LBC position.

To contact us email opinion@lbc.co.uk