
11 March 2025, 23:28
Parents are set to be given the ability to block teenagers from using TikTok during specific times, such as family meals, school, at night or during a holiday.
Until now, TikTok and other social media platforms have enabled parents to set screen time allowances in hours and overnight, but not specific schedules during the day.
Concern among parents about social media is growing, and charities are putting pressure on regulators and the government over new social media laws.
Other features enabled by the app will allow parents to see who their child follows, who is following the child and which accounts the child has blocked.
Users under the age of 16 will also have a meditation tool pushed into their feeds after 10pm to encourage them away from the app and towards sleep.
The platform’s Wind Down mindfulness tool will switch on automatically if under-16s are still on the app after 10pm.
It will interrupt the For You feed with a full-screen takeover, playing calming music. If the user continues to scroll, a second, “harder to dismiss, full-screen prompt” will appear.
Val Richey, TikTok’s global head of outreach and partnerships, trust and safety, said: “It’s not just about hard blocks, it’s about building the skills to get through the online space.
“In countries where we’ve tested this already, most teens have chosen to keep these reminders on. In the coming weeks we’ll introduce guided meditation exercises to help teens transition from screen time to sleep.”
Most social media apps have been introducing parental controls over the past few years in response to concerns. However, research has shown that the take-up of the controls has been low. TikTok does not publish the proportion of parents who use its Family Pairing features.
Child protection groups have criticised platforms for shifting the burden of responsibility to parents instead of making their products safer.
Dame Melanie Dawes, chief executive of Ofcom, which regulates social media platforms, told LBC: “What we don’t want to do is what some of the platforms would say — ‘As long as there are parental controls, then everything’s fine.’ And I would say, no, you’re not actually following your own responsibilities there.
“Parents need to be part of this. Children can do things to keep themselves safe. But above all, I want the platforms to make the service safer.”
She did add, however, that parents “had a role to play”, especially when their child signs up for a social media account.
Ofcom’s research has shown that many children on the platforms are under 13, the minimum age at which platforms can legally process children’s data. At a launch event for the TikTok tools, which sit in the Family Pairing section of the app, a child psychologist suggested that 30 minutes per day was a good limit to set for a 13-year-old on TikTok.
Kirren Schnack said: “I generally recommend to the children that I work with and families and my kids [a limit of] 30 minutes, providing the other things that you need to do are done. Sometimes it might not even be that; it might be 30 minutes on a Wednesday and then the weekend is a bit different.”
Andy Burrows, chief executive of the Molly Rose Foundation, said: “Stronger parental controls are welcome but when children are receiving torrents of depressive content on TikTok you have to wonder if these announcements match the reality for them.
“Ofcom’s child safety codes need teeth to ensure that children are no longer left to protect themselves on social media. Ultimately, these announcements show regulation can work but it needs to be stronger to deliver more than incremental change. Our polling shows that parents would cheer the prime minister on if he chose bold and necessary action to strengthen online safety laws.”
A report on LBC this week showed how an account of a 13-year-old girl on TikTok was served content about self-harm, eating disorders and suicidal ideation.
In November 2023 the Molly Rose Foundation found that almost half of content it analysed on TikTok and Instagram using well-known suicide and self-harm hashtags was potentially harmful.