Online platforms 'could face multimillion-pound fines' over harmful content
15 December 2020, 00:00
Online platforms that fail to protect users or remove harmful content face multimillion-pound fines and being blocked in the UK under new laws, the Government has announced.
However, proposals for criminal liability for senior executives at non-compliant firms appear to have been scaled back, with the Government aiming to bring those powers into force through secondary legislation.
Ahead of publishing a full response to the Online Harms White Paper, Home Secretary Priti Patel and Digital Secretary Oliver Dowden said the proposed laws would create a "new age of accountability" for social media platforms.
Under the new rules, which the Government will bring forward in an Online Safety Bill next year, Ofcom - in its new confirmed role as regulator - will have the power to fine companies up to £18 million or 10% of global turnover, whichever is higher, for failing to abide by a duty of care to their users - particularly children and the vulnerable.
It will also have the power to block non-compliant services from being accessed in the UK, while the Government said it would also reserve the right to impose criminal sanctions on senior executives, powers it says it would not hesitate to bring into force through additional legislation if firms fail to take the new rules seriously.
The proposed legislation will apply to any company in the world hosting user-generated content online which is accessible by people in the UK or enables them to interact with others online.
A small group of high-profile platforms will face tougher responsibilities under a two-tier system, with Facebook, TikTok, Instagram and Twitter to be placed in Category 1 as the companies with the largest online presences and most features deemed high-risk.
In addition to being required to take steps to address illegal content and activity and to provide extra protections for children who access their services, firms in this group will be asked to assess what content or activity on their platform is legal but could pose a risk of harm to adults, and to clarify what "legal but harmful" content they consider acceptable in their terms and conditions.
The legislation will require all firms in Category 1 to publish transparency reports detailing how they are tackling online harms.
Ms Patel said: "We are giving internet users the protection they deserve and are working with companies to tackle some of the abuses happening on the web.
"We will not allow child sexual abuse, terrorist material and other harmful content to fester on online platforms. Tech companies must put public safety first or face the consequences."
The scope of the new legislation will not include online articles and comment sections, as part of efforts to protect freedom of speech.
The Government said it was also working with the Law Commission on whether the promotion of self-harm should be made illegal.
In addition, it confirmed that it plans to include private communications such as instant messaging services and closed social media groups within the scope of the regulations.
It said the proposals will see Ofcom require firms to use highly accurate targeting technology to monitor, identify and remove illegal material such as that linked to child sexual exploitation and abuse.
The Government said it recognised the potential impact this could have on user privacy, and added that the measures would only be used as a last resort where other means had failed, and would be subject to legal safeguards to protect user rights.
This area of the proposals is likely to lead to clashes with tech giants such as Apple and Facebook, which use end-to-end encryption to boost privacy and hide some user content even from the firms themselves.
"I'm unashamedly pro tech but that can't mean a tech free-for-all. Today Britain is setting the global standard for safety online with the most comprehensive approach yet to online regulation," Mr Dowden said.
"We are entering a new age of accountability for tech to protect children and vulnerable users, to restore trust in this industry, and to enshrine in law safeguards for free speech.
"This proportionate new framework will ensure we don't put unnecessary burdens on small businesses but give large digital businesses robust rules of the road to follow so we can seize the brilliance of modern technology to improve our lives."
Fears about the impact of social media on vulnerable people have increased in recent years amid cases such as that of 14-year-old schoolgirl Molly Russell, who took her own life in 2017 and was found to have viewed harmful images online.
Molly's father, Ian, who now campaigns for online safety, has previously said the "pushy algorithms" of social media "helped kill my daughter".
Dr Bernadka Dubicka, chair of the child and adolescent faculty at the Royal College of Psychiatrists, said: "These new rules are a welcome first step but more needs to be done to protect vulnerable children from content that could lead to self-harm or suicide, and also the algorithms that push them to such material.
"As a frontline child psychiatrist, I'm seeing more and more young people affected by harmful online content.
"The Government must make it illegal for companies to allow or promote self-harm content and ensure that breaking the law leads to stiff penalties.
"The Government should rethink their plans and work to include a proposal to compel social media companies to hand over anonymised data to researchers in the legislation."
More widely, the move has been welcomed for stepping up efforts against dangerous material such as terrorism and child sexual abuse.
"I am pleased that the Government has finally confirmed that it plans to introduce a new duty of care," said Anne Longfield, Children's Commissioner for England.
"The signs are that this regulation will have teeth, including strong sanctions for companies found to be in breach of their duties, and a requirement on messaging apps to use technology to identify child abuse and exploitation material when directed to by the regulator.
"However, much will rest on the detail behind these announcements, which we will be looking at closely."
Susie Hargreaves, chief executive of the Internet Watch Foundation, a charity which works to remove child sexual abuse images and videos from the internet, warned that regulation alone will not be enough to tackle the problem.
"We welcome the move to introduce a new duty of care and the interim code of practice for tech companies," she said.
"Regulation alone won't solve this problem. We need to make sure global efforts to fight child sexual abuse and exploitation material are linked up, and that there is an international response to this most international of threats."
NSPCC chief executive Peter Wanless said the charity will be "closely scrutinising the proposals", while Barnardo's chief executive Javed Khan warned "the devil is in the detail".