Ofcom’s new online harms rules for social media firms disappoint campaigners

16 December 2024, 12:14

Girl using mobile phone. Picture: PA

Platforms now have until March to comply with the new Online Safety Act rules or face large fines.

The first set of new online safety rules legally requiring social media and other sites to take action against illegal content has been published by Ofcom.

The regulator said platforms now have three months to assess the risk of their users encountering illegal content and implement safety measures to mitigate those risks, or face enforcement action if they fail to comply with their new duties once they come into force.

The first set of rules focuses on illegal harms – such as terror, hate, fraud, child sexual abuse and encouraging suicide – but one safety charity has criticised the publication, saying it will allow “preventable illegal harm to continue to flourish”.

Ofcom has the power to fine firms up to £18 million or 10% of their qualifying global turnover under the Online Safety Act – whichever is greater – and in very serious cases can apply for sites to be blocked in the UK.

However, the Molly Rose Foundation, which was set up by the family of Molly Russell, who took her own life in 2017 at the age of 14 after viewing suicide content on social media, said it was “astonished” and “disappointed” at Ofcom’s first set of codes.

“Ofcom’s task was to move fast and fix things but instead of setting an ambitious precedent these initial measures will mean preventable illegal harm can continue to flourish,” the charity’s chief executive Andy Burrows said.

“While we will analyse the codes in full, we are astonished and disappointed there is not one single targeted measure for social media platforms to tackle suicide and self-harm material that meets the criminal threshold.

“Robust regulation remains the best way to tackle illegal content, but it simply isn’t acceptable for the regulator to take a gradualist approach to immediate threats to life.

“Today makes clear that there are deep structural issues with the Online Safety Act. The Government must commit to fixing and strengthening the regime without delay.”

Ofcom chief executive Dame Melanie Dawes said: “For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today.

“The safety spotlight is now firmly on tech firms and it’s time for them to act.

“We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.

“Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them.”

Technology Secretary Peter Kyle said the publication of the first set of codes under the Online Safety Act was a “significant step” in making online spaces safer.

“This Government is determined to build a safer online world where people can access its immense benefits and opportunities without being exposed to a lawless environment of harmful content.

“Today we have taken a significant step on this journey.

“Ofcom’s illegal content codes are a material step-change in online safety meaning that from March, platforms will have to proactively take down terrorist material, child and intimate image abuse, and a host of other illegal content, bridging the gap between the laws which protect us in the offline and the online world.

“If platforms fail to step up, the regulator has my backing to use its full powers, including issuing fines and asking the courts to block access to sites.

“These laws mark a fundamental reset in society’s expectations of technology companies.

“I expect them to deliver and will be watching closely to make sure they do.”

By Press Association
