Samaritans launches guidelines for tech firms on managing suicide content

10 September 2020, 00:04


The charity has worked with the Government and tech giants to create new industry standards on handling self-harm content.

New guidelines for internet platforms on how to safely manage self-harm and suicide content have been published by Samaritans in an effort to set industry standards on the issue.

To mark World Suicide Prevention Day, the charity has released the guidance and a new advisory service for internet platforms, created in collaboration with the Government and tech giants such as Facebook, Google and Twitter.

The charity said the initiative is the first time major online companies operating in the UK have been brought together to develop guidance and industry standards, with the aim of tackling the issue of harmful online content while improving access to support resources.

It said the new resources are not just aimed at large social media sites, but any platform which hosts user-generated content.

Molly Russell took her own life in 2017 after viewing harmful images online (Family handout/PA)

Concerns have been raised about self-harm and suicide content online, particularly how platforms handle such content and its impact on vulnerable users, especially young people.

Research from Samaritans and the University of Bristol found that young people use the internet to inform self-harm and suicide attempts.

The study found that at least a quarter of young people who present to hospital after self-harming with suicidal intent have used the internet in connection with their attempt.

Fears about the impact of social media on vulnerable people have also increased amid cases such as that of 14-year-old schoolgirl Molly Russell, who took her own life in 2017 and was found to have viewed harmful content online.

Molly’s father, Ian, who now campaigns for online safety, has previously said the “pushy algorithms” of social media “helped kill my daughter”.

The Government is currently preparing its Online Harms Bill, which is expected to bring in a regulator for internet companies and introduce a duty of care to users to which firms must adhere, or face large fines and other penalties.

Samaritans assistant director of research and influencing Jacqui Morrissey said: “The internet is an integral part of our lives and an invaluable resource, but the challenge with this is making sure that online content is shared in a safe way, particularly for those who are vulnerable.

“Whilst we have seen steps in the right direction over the last 18 months, we still think there is further to go, and we need all platforms and sites to be taking this seriously.

“The guidance isn’t just for large global companies – any company that hosts user-generated content needs to think about how this type of content may appear on their platform and how they will respond to it.

“These industry standards will enable more effective removal of harmful content, whilst also improving access and engagement with content that can be a source of support. We want all sites and platforms to recognise that self-harm and suicide content has potential for serious harms.

“We also want them to understand what that means for their platform, take action to minimise access to harmful content and create more opportunities for support in the online environment.

“Global companies can also use these guidelines more broadly across other countries that they operate in and apply them to their global policies rather than just the UK ones.”

Tara Hopkins, head of public policy, Europe Middle East and Asia, at Instagram – which is owned by Facebook – said the issue is “complex”, with “devastating consequences”, and that the tech giant wants to do “everything we can” to keep users safe.

“We try to strike the delicate balance of giving people the space to talk about mental health, while protecting others from being exposed to harmful content,” she said.

“That’s why we’re grateful to partners like Samaritans for helping us develop our existing content policies, and welcome these new guidelines to help the tech industry address these issues as sensitively and effectively as possible.”

Samaritans also announced that it is launching an online harms advisory service, which will provide specialist advice to all sites and platforms hosting user-generated content on how to respond to self-harm and suicide material.

The service and the new guidelines form part of the charity’s Online Harms Programme, a three-year project led by Samaritans with support from the Department for Health and Social Care and tech firms Facebook, Instagram, Google, YouTube, Twitter and Pinterest.

Mental health and suicide prevention minister Nadine Dorries said: “We have seen over the last few months how social media has the power to bring us together and connect us.

“However, all too often those who are vulnerable can all too easily access damaging content that promotes self-harm and suicide.

“Online platforms hosting user-generated content have a serious responsibility to help tackle the issue of harmful online content, while also working to improve access to content that can offer support, and I would urge them to take immediate action by implementing this incredibly helpful guidance.”

Brian Dow, deputy chief executive of the charity Rethink Mental Illness and co-chairman of the National Suicide Prevention Alliance, said: “World Suicide Prevention Day is particularly important in a year when the impact of the Covid-19 pandemic has truly tested the nation’s mental health.

“Despite the anticipated surge in people experiencing mental health problems, an increase in the number of suicides is not an inevitable scenario. We must remember that suicide is preventable and it’s crucial the right support is in place for people at greater risk.

“The reasons why someone might decide to take their life are complex, but the pandemic is exacerbating many of the known factors which can put people at greater risk of reaching crisis, such as financial problems, social isolation or difficulties accessing health and social care.

“Given the likely long-term impact of Covid-19, we need now more than ever for the Government to prioritise the nation’s mental health in its plans for recovery with a cross-government mental health strategy.”

By Press Association
