Met police must ditch ‘highly invasive’ facial recognition tech, BLM demands

HUMAN rights groups, including Black Lives Matter UK, are demanding the new Metropolitan Police chief end his force’s use of “inaccurate and highly invasive” facial recognition technology. 

In a letter to Met Commissioner Sir Mark Rowley, sent on his first day on the job today, several major organisations have called on him to ditch the tech, which they claim has an 87 per cent failure rate. 

The force began trialling the use of live facial recognition technology in the capital in 2016, before rolling out the software more widely earlier this year. 

The tech is usually deployed in crowded areas and has been used to scan hundreds of thousands of faces at protests, sporting events and even concerts. 

It works by scanning the faces of everyone in range and comparing them in real time with a database of people on a watch list. 
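In engineering terms, that comparison typically means converting each detected face into a numerical "embedding" and measuring how close it is to the embeddings of people on the watch list, raising an alert when the similarity passes a threshold. The sketch below is a minimal illustration of that matching step in Python, assuming the embeddings have already been produced by a separate face-detection model; the vectors, names and threshold are invented for illustration and are not drawn from the Met's system.

```python
import numpy as np

# Hypothetical, pre-computed face embeddings. In a real system these would
# come from a face-detection/encoding model, not hand-written vectors.
watchlist = {
    "person_a": np.array([0.12, 0.85, 0.43, 0.91]),
    "person_b": np.array([0.77, 0.10, 0.65, 0.02]),
}

def cosine_similarity(a, b):
    """Similarity between two embeddings: 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face_embedding, threshold=0.95):
    """Return the best watchlist match above the threshold, or None.

    The threshold controls the trade-off the article describes: set it too
    low and the system raises false alerts on innocent passers-by; set it
    too high and it misses people who really are on the list.
    """
    best_name, best_score = None, 0.0
    for name, listed in watchlist.items():
        score = cosine_similarity(face_embedding, listed)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else None

# A face scanned from the crowd (again, an invented example vector).
scanned = np.array([0.13, 0.84, 0.45, 0.90])
print(match_against_watchlist(scanned))
```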

Signatories of the letter, which also include Liberty and Big Brother Watch, said the tech wrongly matched individuals to the police’s watch list in almost nine in 10 cases.

Cases of false alerts include those of a 14-year-old schoolboy and a French exchange student who had only been in the country for a few days.

The campaigners also repeated concerns that the tech is being deployed in areas with a higher proportion of people from ethnic minorities and is even less accurate for women. 

Big Brother Watch director Silkie Carlo, whose organisation has carried out observations of recognition tech deployments, said: “Millions of Londoners’ faces have been scanned by facial recognition cameras without their consent and without many parliamentarians’ awareness.

“If the new commissioner is serious about fighting crime effectively while addressing discrimination in policing, he cannot endorse the use of a technology with an 87 per cent failure rate, that pointlessly drains police resources, and is well known to have issues with racist and sexist misidentifications, many of which we’ve witnessed.”

Liberty director Martha Spurrier warned the Met’s use of the tech was violating people’s rights and threatening their freedoms. 

Police use of live facial recognition technology has been subject to legal action, with the Court of Appeal ruling in 2020 that South Wales Police’s use of the tech was unlawful and violated people’s human rights.

A Met spokeswoman said the technology has helped to “locate dangerous individuals and those who pose a serious risk to our communities.” 

She added that the Met’s own calculations put the rate of false alerts at between 0 and 0.08 per cent.
