Online Safety Act ‘missing vital opportunity’ on suicide content

11 February 2025, 00:04

Samaritans said the Act does not go far enough (PA)

The charity Samaritans has warned that the current plans for enforcing the strictest content rules will leave "small but dangerous" forums untouched.

Dangerous suicide and self-harm content will remain online despite the introduction of the Online Safety Act because of “big gaps around high risk content”, Samaritans has said.

The charity’s policy and influencing lead said online safety regulator Ofcom is currently choosing to “ignore” advice from safety groups around how it decides which websites are in scope of the new rules.

Jacqui Morrissey told the PA news agency that Ofcom has advised the Government to use site user numbers as the criterion for subjecting platforms to the strictest measures of the Act – which would require them to hide or remove such content even for users aged over 18 – but said this approach would leave a number of "small, but very high risk platforms" outside these toughest rules.

The charity has raised its concerns to coincide with Safer Internet Day (February 11).

“The Online Safety Act includes really important provisions that enable government to subject small but very high risk platforms to the strictest measures,” Ms Morrissey told PA.

“We are extremely disappointed that Ofcom have chosen to ignore our advice and the advice of many organisations in making the criteria for the strictest measures focused on the number of users.

“We know the Secretary of State has chosen to accept Ofcom’s advice, so basically, this means dangerous suicide and self-harm content is going to continue to be accessible to anyone over the age of 18.

“Turning 18 doesn’t make you any less vulnerable to this content. Lots of research shows the part that the internet has played in suicides.”

Ms Morrissey highlighted studies which had found internet use played a role in 26% of suicide deaths among under-20s and 13% among 20-to-24-year-olds, and warned that the Government and Ofcom were “missing this vital opportunity”.

“These are all people who are over the age of 18 who are going to continue to come into contact with this very dangerous, very harmful content, which isn’t going to be covered by the Act at the moment,” she said.

“So we’re missing this really vital opportunity – Government and Ofcom have both said they recognise the concerns around small but very high risk services and they’ve said the decision on which services have to meet these highest measures needs to be based on evidence.

“Our concern is, why are they not listening to this evidence?

“We’ve heard all too often the role that online forums can play in someone taking their own life. We know coroners have clearly linked deaths to a particular site.

“So how much more evidence is needed to recognise this level of harm being caused on a site, and to ensure that it is subject to the strongest measures within the Act?

“We’re saying it is crucial that Government and Ofcom fully harness the power of the Online Safety Act to ensure people are protected from this dangerous content and to make sure that implementation is going to be effective as soon as possible.”

Ms Morrissey added that Samaritans believes the level of risk – for example, whether either Ofcom or a coroner can “reasonably link one or more deaths” to a particular site – could be a suitable basis for deciding whether a platform should be placed in the highest category of the rules.

An Ofcom spokesperson said: “From next month, all platforms in scope of the Online Safety Act – including small but risky services – will have to start taking action to protect people of all ages from illegal content, including illegal suicide and self-harm material.

“We won’t hesitate to use our robust enforcement powers against sites that don’t comply with these duties – which could include applying to a court to block them in the UK in serious cases.

“Additional duties such as consistently applying terms of service will be a powerful tool in making larger platforms safer in due course.

“But they would do little to tackle the harm done by smaller, riskier sites – as many of them permit harmful content that is legal.”

A Government spokesperson said: “Suicide devastates families and social media companies have a clear responsibility to keep people safe on their platforms.

“We are determined to keep people safe online by swiftly implementing the Online Safety Act which will tightly regulate smaller platforms that spread hate and suicide content.

“Platforms will need to proactively remove illegal suicide material and, if accessed by children, protect them from harmful material – regardless of whether or not they are a categorised service.

“We expect Ofcom to use the full range of its powers – including fining and seeking court approval to block access to sites – if these platforms fail to comply.”

Ms Morrissey added that the charity was also “concerned” by the recent “rolling back” of certain moderation and fact-checking tools by some platforms, most notably Meta, which last month said it would replace third-party fact checkers in the US with user-generated community notes and would also loosen content moderation on some topics.

“We would hope that what the Online Safety Act does is create a minimum standard, but we would hope that platforms go beyond this and implement best practice,” she said.

“We know the Act isn’t going to cover lots of kinds of dangerous content for over-18s, but platforms can choose to do that – they can choose to make their platforms as safe as possible for their users.

“So we are concerned that we’re perhaps seeing some rolling back of progress with platforms recently around keeping people safe online. That puts us in, I think, a more precarious situation than we were a year ago.

“The internet can be such a positive place. It can be a really important space for people who are feeling suicidal to access really helpful information, to be able to talk to people who might be experiencing similar things in a safe and supportive environment.

“So tech platforms can be at the forefront of creating these safe spaces to enable safe conversations to be happening.”

Samaritans is available on 116 123 or at www.samaritans.org/how-we-can-help/contact-samaritan/

By Press Association
