
10 March 2025, 06:59 | Updated: 10 March 2025, 08:27
Online Safety Day is today, Monday 10th March, from 7am to midnight on LBC, available across the UK on Global Player on your smart speaker, iOS or Android device; on DAB digital radio and TV, at LBC.co.uk and in London on 97.3 FM.
LBC's Henry Riley: What does TikTok show our kids? | Online Safety Day
As part of LBC's Online Safety Day, we conducted an experiment to see exactly what kind of content a 13-year-old girl would be exposed to on TikTok. The results were eye-opening – and deeply concerning.
To ensure the experiment was as realistic as possible, LBC reporter Henry Riley created a brand-new TikTok account, setting the birthdate to that of a 13-year-old.
Using AI tools, he generated a profile picture and bio that would be typical of a young teenage girl. He then selected interests such as beauty, fashion, music, dance, and entertainment – categories that many girls of this age might choose.
Initially, the content seemed fairly innocuous: makeup tutorials, dance challenges, and lighthearted school-related videos. Two girls dancing in a classroom; an elderly man dancing on roller skates; a video of a baby. So far, the videos closely reflected the interests we selected when creating our account.
However, as we continued to scroll, we started seeing more relationship-based content, including posts about romantic texting habits and breakups – topics that might not be entirely suitable for a young teen.
As we engaged with the platform, the algorithm adapted. Soon, the content shifted towards mental health discussions, some of which were helpful. But within just a few swipes, we found ourselves immersed in much darker territory.
Posts about self-harm, eating disorders, and even suicidal ideation began to dominate the feed.
For instance, one post suggested ways to remove a blade from a pencil sharpener. Others talked about feeling invisible, struggling with body image, and experiencing extreme loneliness.
Shockingly, despite being served increasingly troubling content, we never once saw support messages from mental health charities or helplines.
Some of the other content recommended to us was cooking-related. First, we saw some innocent videos of someone sharing what they eat in a day, or showing a breakfast idea.
But even this took a dark turn – just one swipe later, the food content the algorithm served up had shifted to videos about eating disorders.
By the third day of the experiment, the first video that appeared on the For You page when we opened the app contained a message about suicide.
The second featured a post about using alcohol and smoking as coping mechanisms. Other videos explicitly referenced eating disorders, depression, and the idea of suffering in silence.
Many of these videos had tens of thousands of 'likes'; some had more than 100,000.
For a 13-year-old girl, this kind of content could be incredibly damaging. Social media should be a space where young users can find community, creativity, and positive influences.
Instead, it appears that TikTok’s algorithm is quick to push vulnerable teenagers towards extreme and harmful narratives.
If you were watching a television programme that tackled these issues, you’d likely see helpline numbers and signposts to support services.
But on TikTok, despite being flooded with distressing content, there were no such warnings or resources provided.
With the Online Safety Act now in force, platforms like TikTok are under increased scrutiny. But as our investigation shows, a vulnerable 13-year-old could be swept into a dark and dangerous algorithmic rabbit hole in just a matter of days.
So, the big question remains: will TikTok and other social media platforms take responsibility and introduce effective safeguards?
And will the new legislation be enough to protect children online? Right now, the evidence suggests there’s a long way to go.