Q&A: Ofcom, the Online Safety Act, and codes of practice for social media

16 December 2024, 14:04


Ofcom has published the first set of new rules to regulate social media.

Online safety regulator Ofcom has published its first set of codes and guidance under the Online Safety Act, setting out the duties tech firms must comply with regarding illegal harms.

The landmark safety laws will require platforms to put a range of safety measures in place which Ofcom says will help better protect users – including better moderation, built-in safety tools, clear ways to report harmful content and clear accountability to senior staff over safety compliance issues.

Here is a closer look at Ofcom’s announcement and the wider legislation.

– What is the Online Safety Act?

Passed in late 2023, the Online Safety Act is the UK’s first major legislation to regulate social media, search engine, messaging, gaming, dating, pornography and file-sharing platforms.

At its core, the Act places a range of new safety duties on sites, which will compel them to protect users from illegal and other harmful content.

Platforms are expected to do so by putting robust safety features in place to prevent such content appearing on their sites in the first place, and by acting swiftly to remove it when it does.

The new duties will be set out in a range of codes of practice and other guidance published by Ofcom over the coming months, with each one focusing on a specific content area.

The Act gives Ofcom the power to fine firms that fail to meet these duties – potentially up to billions of pounds for the largest sites – and, in serious cases, allows it to seek court approval to block access to a site in the UK.

– What has Ofcom published now?

The regulator has released its first codes of practice, which specifically focus on illegal harms online.

This is content such as that linked to terrorism, hate, fraud, child sexual abuse and assisting or encouraging suicide, Ofcom says.

The codes are designed to help platforms comply with the new rules by setting out best practice on the measures and structures they should have in place by the time the duties are expected to come into force in three months’ time.

The largest platforms will be expected to do the most to protect users, in particular children.

The codes of practice outline that sites should have senior staff accountability for safety, have strong moderation and reporting tools in place, as well as robust measures to protect children from abuse and exploitation online.

The first set of codes also calls for measures to tackle pathways to online grooming, the use of automated tools to detect child sexual abuse material, and steps to protect women and girls, identify fraud and remove terrorist accounts.

– What has been the response?

While many have welcomed steps being taken to better regulate social media, some campaigners have expressed their disappointment at Ofcom’s approach, arguing it has not been forceful enough.

The Molly Rose Foundation, which was set up by the family of Molly Russell, the 14-year-old who ended her life in 2017 after viewing suicide content on social media, said it was “astonished” and “disappointed” with Ofcom’s codes, adding there is not “one single targeted measure” for social media sites to “tackle suicide and self-harm material that meets the criminal threshold”.

Maria Neophytou, acting chief executive at the National Society for the Prevention of Cruelty to Children, said the charity was “deeply concerned that some of the largest services will not be required to take down the most egregious forms of illegal content, including child sexual abuse material”.

She said Ofcom’s proposals will “at best lock in the inertia to act and at worst create a loophole which means services can evade tackling abuse in private messaging without fear of enforcement”.

– What has Ofcom said about the codes?

Dame Melanie Dawes, Ofcom chief executive, said the introduction of the Online Safety Act meant that sites would no longer be “unregulated, unaccountable and unwilling to prioritise people’s safety over profits”.

“The safety spotlight is now firmly on tech firms and it’s time for them to act,” she said.

“We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.”

Ofcom has said it will continue to roll out further codes and proposals in 2025, including more on the response to child sexual abuse material.

– What happens next?

Tech firms now have until March to begin putting Ofcom’s measures in place on their sites, ensuring they comply with the relevant duties of the Online Safety Act as it begins to come into force.

Meanwhile, Ofcom has said it will continue to publish more codes of practice in the early months of next year on a range of other harms included in the Act, including guidance for pornography publishers expected in January, guidance on protecting women and girls in February, and details on additional protection for children around harmful content promoting suicide, self-harm and eating disorders in April.

By Press Association
