
1 March 2025, 10:38
Headlines frequently highlight the dangers that the internet, particularly social media, poses to under-18s.
Concerns are not only about the content algorithms push to impressionable minds but also about the age of the children viewing this content.
While most social media sites require users to be at least 13 years old, the Advertising Standards Authority’s "100 Children Report" reveals that children often lie about their age to access these platforms. As a result, much younger children are exposed to content and adverts for age-restricted products, which can negatively impact their mental health and contribute to self-harm, suicide, or crime.
Social media companies must start taking responsibility for the content their algorithms propagate. Traditionally, they have claimed they cannot police content, placing the onus on those who upload it to ensure its safety. This approach is no longer tenable. The Online Safety Act mandates that social media entities protect users, especially young people, from online harms. Tech companies need to increase moderation, improve transparency and strengthen their safeguarding.
Ofcom is actively working to provide guidance on enforcing the Online Safety Act, recently proposing measures for tech firms to tackle online harms against women and girls. Crucially, if Ofcom finds that a service provider has breached its obligations under the Act, it can impose penalties of up to 10% of the company's worldwide revenue or £18 million, whichever is greater, and require remedial action.
While these measures are welcome, Ofcom needs to act swiftly. With rapid advancements in technology and AI, as well as a shifting political landscape, Ofcom must ensure its guidance is forward-thinking and flexible, encouraging the positive use of new technologies to counter online harm. Importantly, Ofcom must start imposing significant fines for non-compliance to give the Online Safety Act the teeth it needs to effect meaningful change.
If Ofcom cannot keep pace with technology and ensure that social media companies put safety provisions in place to protect children online, we may see increased pressure for a blanket ban on social media for under-16s, as recently introduced in Australia.
________________
Iona Silverman is a partner at law firm Freeths.
LBC Opinion provides a platform for diverse opinions on current affairs and matters of public interest.
The views expressed are those of the authors and do not necessarily reflect the official LBC position.
To contact us email views@lbc.co.uk