
I saw what 13-year-old boys watch on TikTok - the videos are shocking but not a surprise, writes Henry Riley



By Henry Riley

I was shocked but not surprised by what our investigation found.


Back in March, I was horrified by some of the content we were served while posing as a 13-year-old girl.

From videos linked to depression, escalating to self-harm and eventually suicide, I couldn’t believe this material would be shown to a 13-year-old.

Fast forward six months, and the intense debate surrounding the measures and implications of the Online Safety Act has been ubiquitous in public discourse.

In that time, children’s safety on social media has dominated forums, debates and opinion polls. Given the clamour to improve a dire situation, you’d expect that things would have improved.

Alas, not really.

I set up my account as a 13-year-old boy, using an AI-generated image and a generic username created by ChatGPT.

We decided not to ‘like’ or search for specific videos; instead, we would simply watch some of the content we were served, and I clicked on the odd hashtag from a video that came up on my ‘For You’ page.

It all started fairly well: videos of sunsets, funny clips of grandparents, and content creators of a similar age dreading going back to school. Dare I say it was quite entertaining, scrolling through cat videos and lads on holiday convinced they’d seen sharks.

But it was incredible how quickly that changed. The next day, within minutes of swiping, I was inundated with far-right content. AI-generated videos of people shouting ‘go home’ at dinghies, chants of ‘Keir Starmer’s a w*****’, and a general sense of nastiness began to creep into my algorithm.

At one point, I was met with a video of migrants on a small boat singing the much-cited Jet2 holiday advert, but with the lyrics replaced by the p-word racial slur.

I saw what 13-year-old boys are watching on TikTok - the videos are shocking but not a surprise, writes Henry Riley. Picture: Alamy/LBC

The next day, it changed to neo-Nazi groups and famous fascists, as well as sexualised and misogynistic content.

This continued over days and completely dominated my feed. I couldn’t escape divisive, racist and offensive content, much of which I can’t actually describe here.

Videos of similarly aged girls mocking 13-year-old boys were commonplace, along with excessive gym content and advice on how to ‘flirt’ with girls of the same age.

Every day, I was hit with a warning that I’d been on TikTok too long. You might hope this would force me away from doom scrolling, but it merely meant that – at one point – I had to input the code ‘1234’ to continue.

Next to no safeguards.

Was I exposed to porn? No.

Andrew Tate? No.

Suicidal content? No.

But I was exposed to leading neo-Nazis and fascists, and given dates for far-right marches. It was impossible to escape, and deeply disturbing.

I don’t know the codes of practice by which TikTok abides, and I’m not necessarily here to cast aspersions.

Just ask yourself: Is this the sort of content a 13-year-old should be seeing?

____________________

Henry Riley is a reporter and presenter for LBC.

LBC Opinion provides a platform for diverse opinions on current affairs and matters of public interest.

The views expressed are those of the authors and do not necessarily reflect the official LBC position.

To contact us email opinion@lbc.co.uk

___________________

A TikTok spokesperson told LBC: "We're committed to making TikTok a safe, positive space for teens and we build the strongest safeguards into their accounts by default — from industry-leading screen time limits, private accounts, and easy-to-use tools for parents.

Of the content we remove for breaking our rules, 99% is found before it is reported to us and this single account doesn’t reflect the typical teen experience on TikTok."