'No arrests from false facial recognition alerts'

Live facial recognition cameras sit on top of a police van in Croydon. Image source: PA Media
Image caption: The Met said more than a quarter of people arrested using the technology between September 2024 and September 2025 were involved in violence against women and girls


The Metropolitan Police has said it will be "scaling up" its use of Live Facial Recognition (LFR) technology, as it reported no arrests resulting from a false alert in the past 12 months.

Between September 2024 and September 2025, 962 people were arrested following LFR deployments, the force said.

While no one was arrested following a false alert, 10 people - of whom eight were black - were falsely flagged by the system. Four were not stopped, and the rest were spoken to by officers for under five minutes.

Lindsey Chiswick, from the Met, said the technology was a "powerful and game-changing tool", but human rights groups have raised concerns about privacy and the potential for false matches.

In a report published on Friday, the Met Police said LFR deployments had led to more than 1,400 arrests in total, with more than 1,000 of those people charged or cautioned.

These included people wanted by police or the courts, as well as offenders who were in breach of court-imposed conditions, such as sex offenders or stalkers.

More than a quarter of those arrests were for people involved in violence against women and girls, including those suspected of rape, strangulation and domestic abuse, the force said.

The report added that in a survey by the Mayor's Office for Policing and Crime, 85% of respondents backed the technology's use to locate serious and violent criminals, those wanted by the courts, and those posing a risk to themselves.

Shaun Thompson stands across the road from an entrance to London Bridge tube station.
Image caption: Shaun Thompson was wrongly identified by LFR cameras as a suspect in February 2024

The campaign group Big Brother Watch is bringing a legal challenge against the Met Police's use of the technology, alongside Shaun Thompson, who was wrongly identified by an LFR camera in February 2024.

Mr Thompson previously told the BBC his experience of being stopped had been "intimidating" and "aggressive".

Responding to the Met's report, Jasleen Chaggar, legal and policy officer at Big Brother Watch, said: "It is alarming that over three million people have been scanned with police facial recognition cameras in the past year in London alone.

"Live facial recognition is a mass surveillance tool that risks making London feel like an open prison, and the prospect of the Met expanding facial recognition even more across the city is disproportionate and chilling.

"The Met's report shows that the majority of people flagged by facial recognition were not wanted for arrest."

Ms Chaggar said it was "disturbing that 80% of the innocent people wrongly flagged by facial recognition were black".

"We all want police to have the tools they need to cut crime but this is an Orwellian and authoritarian technology that treats millions of innocent people like suspects and risks serious injustice," she said.

"No law in this country has ever been passed to govern live facial recognition and given the breath-taking risk to the public's privacy, it is long overdue that the government stops its use to account for its serious risks."

Live facial recognition cameras on top of a van in Croydon. Officers are standing close to the van, and people walk by in coats. Image source: PA Media
Image caption: The Met said it would be "scaling up" the use of the technology

The Met said that although eight out of 10 false alerts involved individuals from black ethnic backgrounds, it was "based on a very small sample size".

"Overall, the system's performance remains in line with expectations, and any demographic imbalances observed are not statistically significant," it said in its report, adding that: "This will remain under careful review."

The force said LFR had a low false alert rate of 0.0003% across more than three million faces scanned.

Following the report, the force said it would be "building on its success" by increasing deployments each week.

A police sign stating: "Police live facial recognition in operation". Image source: PA Media
Image caption: The Met has regularly told the public that their biometric data is deleted if no alerts are triggered

Ms Chiswick, the lead for LFR at the Met and nationally, said: "We are proud of the results achieved with LFR. Our goal has always been to keep Londoners safe and improve the trust of our communities. Using this technology is helping us do exactly that.

"This is a powerful and game-changing tool, which is helping us to remove dangerous offenders from our streets and deliver justice for victims.

"We remain committed to being transparent and engaging with communities about our use of LFR, to demonstrate we are using it fairly and without bias."

If someone walks past an LFR camera and is not wanted by the police, their biometrics are immediately and permanently deleted, the Met Police said.

