Met Police gave images for King's Cross facial recognition scans

Image caption: King's Cross Estate's managers say the facial recognition scans ended in 2018

London's Metropolitan Police Service has revealed that it supplied images for a database used to carry out facial recognition scans of people who visited the King's Cross estate.

The force had previously said it had not been involved with the scheme, but now acknowledges this was "incorrect".

London's mayor has asked for a report to reveal exactly what data was shared with whom "as a matter of urgency".

The Surveillance Camera Commissioner is also making inquiries.

"In light of this acknowledgement from the MPS I will be contacting senior officers at the force to understand how they were complying with section 33 of the Protection of Freedoms Act and paying due regard to the Surveillance Camera Code of Practice," Tony Porter told the BBC.

The code requires there to be "as much transparency" as possible and a clear justification for the use of a facial recognition system.

A spokesman for the Met said it had shared the images "to assist in the prevention of crime" under "a local agreement" made with the King's Cross Estate partnership.

"The MPS has not shared any images with the King's Cross Estate, or any related company, for facial recognition purposes since March 2018," he added.

"This information sharing had occurred at a local level over a limited period and has only just come to light to the central team managing police imagery.

"As a result all local chief superintendents have been communicated with to reinforce that there should be no local agreements or local use of live facial recognition."

On Friday, the British Transport Police acknowledged it too had provided images. Earlier this week, it had said it had not contributed to or benefited from the scheme.

"Between 2016 and 2018, local teams based at Kings Cross worked with our partners to share images of a small number of convicted offenders, who routinely offended or committed anti-social behaviour (ASB) in the area," said a spokesman for the BTP.

"This was legitimate action in order to prevent crime and keep people safe. Understandably, the public are interested in police use of such technologies, which is why we are correcting our position."

Public debate

The King's Cross development covers a 67-acre (0.3-sq-km) area containing shops, offices and leisure activities. It is privately owned but much of the area is open to the public.

The BBC has again asked the developer Argent whether there were any notices on show to tell the public and workers that it was making use of facial recognition tech. But a spokeswoman declined to add anything to the statement the firm had already issued earlier this week.

It said that two facial recognition cameras had been operational until March 2018, but work to introduce a replacement system had been stopped.

The firm's use of the tech is already under investigation by the Information Commissioner's Office.

Argent has not publicly disclosed what software it was using to power its system.

Researchers have raised concerns that some systems are prone to bias, as they are more likely to misidentify women than men, and darker-skinned people than lighter-skinned people. Earlier this week, the Met's most senior officer, Cressida Dick, drew attention to the problem.

The latest development comes on the same day that the High Court ruled that separate tests of automated facial recognition (AFR) technology by South Wales Police were lawful. The trials had been challenged by a man who claimed his human rights had been breached when he was photographed while shopping.

One critic of facial recognition technology said that there was now a need for a parliamentary inquiry.

"We need to debate whether we want [automated facial recognition] and if we do, under what conditions and with what safeguards," commented researcher Stephanie Hare.

"The British public has not been given the opportunity to express its views on something that is so inaccurate, so invasive, and so threatens their privacy and civil liberties.

"It is out of control at present and what we have learned today is that the London Met has, at a minimum, not been able to provide correct information."