Police widening use of live facial scanning with no clear legal grounds – peers

27 January 2024, 00:04

Facial Recognition Technology. Picture: PA

The Lords’ Justice and Home Affairs Committee has called into question the lawfulness of the use of live facial recognition.

Police use of live facial recognition surveillance is being expanded without clear legal grounds, peers have warned.

The Lords’ Justice and Home Affairs Committee has called into question the lawfulness of the deployment of the technology by forces across England and Wales.

In a letter to Home Secretary James Cleverly published on Saturday, the committee called for tightened regulation and independent scrutiny of how the equipment is used.

Live facial recognition (LFR) cameras are used by police in some areas to analyse the faces of passersby and search for specific people.

The committee said it acknowledged the technology may be a valuable tool in catching criminals but that it was “deeply concerned that its use is being expanded without proper scrutiny and accountability”.

There are “no rigorous standards or systems of regulation” in place for the deployment of LFR and “no consistency” in approaches to training officers in its use, peers said.

Baroness Hamwee, chairwoman of the committee, said: “Does the use of LFR have a basis in law? Is it actually legal? It is essential that the public trusts LFR and how it is used.

“It is fundamental that the legal basis is clear. Current regulation is not sufficient. Oversight is inadequate.

“Technology is developing so fast that regulation must be future-proofed. Police forces may soon be able to link LFR cameras to trawl large populations, such as Greater London, and not just specific localities.

“We are an outlier as a democratic state in the speed at which we are applying this technology. We question why there is such disparity between the approach in England and Wales and other democratic states in the regulation of LFR.”

Privacy campaigners and politicians have previously called for police to stop using facial scanning technology, citing concerns over human rights and potential for discrimination.

Civil liberties group Big Brother Watch has branded the tool “Orwellian” and suggested that any widening of its use would lack a clear democratic mandate.

But the Government last year announced it was considering expanding use of the surveillance technology across forces and security agencies.

The Home Office argues that the technology frees up officers to spend more time out on the beat and working on complex investigations.

A Government spokesperson said: “Facial recognition, including live facial recognition, is a powerful tool that has a sound legal basis, confirmed by the courts. It has already helped the police to catch a large number of serious criminals, including for murder and sexual offences.

“The police can only use facial recognition for a policing purpose, where necessary, proportionate and fair, in line with data protection and human rights laws.”

The National Police Chiefs’ Council said it welcomed the committee’s scrutiny and would consider its recommendations, but that LFR is always used “proportionately and transparently”.

Individual chief constables are also held to account by their police and crime commissioners and mayors, who examine operational decisions on LFR, the council added.

The NPCC lead for facial recognition and the Metropolitan Police’s director of intelligence, Lindsey Chiswick, said: “The High Court and the Court of Appeal have previously recognised the existing legal basis for the police to use (LFR) technology – namely under common law in the UK.

“LFR is a tool which helps police to identify wanted individuals and it is always used proportionately and transparently, with communities told when it will be deployed.”

By Press Association
