Meta’s ‘bonfire’ of safety policies a danger to children, charity says

28 January 2025, 07:14

A girl using a mobile phone. Picture: PA

The Molly Rose Foundation has warned social media risks becoming a haven for harmful content if safety regulation is not bolstered.

Meta’s recent “bonfire of safety measures” risks taking Facebook and Instagram back to where they were when Molly Russell died, the charity set up in her name has warned.

The Molly Rose Foundation said new online safety regulator Ofcom must strengthen incoming regulation in order to ensure teenagers are protected from harmful content online.

The charity was set up by Molly’s family after she ended her life in 2017, aged 14, having viewed harmful content on social media sites including Meta-owned Instagram.

Molly Russell (Family handout/PA)

Earlier this month, Meta boss Mark Zuckerberg announced sweeping changes to the company’s policies in the name of “free expression”, including plans to scale back content moderation. The firm will end automated scanning for some types of posts and instead rely on user reports to remove them.

Campaigners called the move “chilling” and said they were “dismayed” by the decision, which has been attributed to Mr Zuckerberg’s desire to forge a positive relationship with new US President Donald Trump.

Andy Burrows, chief executive of the Molly Rose Foundation, said: “Meta’s bonfire of safety measures is hugely concerning and Mark Zuckerberg’s increasingly cavalier choices are taking us back to what social media looked like at the time that Molly died.

“Ofcom must send a clear signal it is willing to act in the interests of children and urgently strengthen its requirements on tech platforms.

“If Ofcom fails to keep pace with the irresponsible actions of tech companies the Prime Minister must intervene.

“Amid a strategic rollback of their safety commitments, preventable harm is being driven by Silicon Valley but the decision to stop it in its tracks now sits with the regulator and Government.”

In a letter sent to Ofcom, the foundation urged the regulator to strengthen the Online Safety Act by bolstering requirements around content moderation, including requiring firms to proactively scan for all types of intense depression, suicide and self-harm content.

It also urges the regulator to ensure that Meta’s newly loosened policies around hate speech are not allowed to apply to children, and to seek clarification on whether Meta can change its rules without going through its usual internal processes, after reports suggested Mr Zuckerberg made the policy changes himself, leaving internal teams “blindsided”. The foundation said Ofcom should ensure this cannot happen again.

In a statement, a Meta spokesperson said: “There is no change to how we define and treat content that encourages suicide, self-injury and eating disorders.

“We don’t allow it and we’ll continue to use our automated systems to proactively identify and remove it.

“We continue to have Community Standards, around 40,000 people working on safety and security to help enforce them, and Teen Accounts in the UK, which automatically limit who can contact teens and the types of content they see.”

Earlier this month, Molly’s father Ian, the chairman of the Molly Rose Foundation, told the Prime Minister that the UK was “going backwards” on online safety.

Mr Russell said in a letter to Sir Keir Starmer that Ofcom’s approach to implementing the Online Safety Act has “fundamentally failed to grasp the urgency and scale of its mission”, and changes were needed to bolster the legislation.

The Molly Rose Foundation has also previously warned that Meta’s approach to tackling suicide and self-harm content is not fit for purpose, after research found the social media giant was responsible for just 2% of industry-wide takedowns of such content.


An Ofcom spokesperson said: “All platforms operating in the UK – including Meta – must comply with the UK’s online safety laws, once in force.

“Under the Online Safety Act, tech firms must assess the risks they pose, including to children, and take significant steps to protect them.

“That involves acting swiftly to take down illegal content – including illegal suicide and self-harm material – and ensuring harmful posts and videos are filtered out from children’s feeds.

“We’ll soon put forward additional measures for consultation on the use of automated content moderation systems to proactively detect this kind of egregious content.

“We are in contact with social media companies, including Meta, about the safety measures they have in place now, and what more they will have to do to comply once the duties are fully in force.

“No one should be in any doubt about Ofcom’s resolve to hold tech firms to account, using the full force of our enforcement powers where necessary.”

By Press Association
