King's Cross facial-recognition plans revealed by letter
King's Cross Central's developers said they wanted facial-recognition software to spot people on the site who had previously committed an offence there.
The detail has emerged in a letter one of its managers sent to the London mayor on 14 August. Sadiq Khan had sought reassurance that using facial recognition on the site was legal.
Two days earlier, the developer, Argent, had indicated it was using the technology to "ensure public safety".
On Monday, it said it had now scrapped work on new uses of the technology.
But its earlier, "limited" facial-recognition system had been in use until March 2018.
In the August letter, Argent partner Robert Evans wrote: "We introduced a limited facial-recognition capability alongside our CCTV system in 2015, limited to two cameras along King's Boulevard, to help deploy and direct our estate teams.
"However, ongoing construction work in the areas means that this system has not been operational continually since that date.
"The facial-recognition system is not currently operational and has not been so for over 12 months."
The Home Office's Surveillance Camera Code of Practice says "there must be as much transparency in the use of a surveillance camera system as possible" and "any use of facial recognition or other biometric characteristic recognition systems needs to be clearly justified".
The code was drawn up to address prior concerns about private-sector organisations sharing access to surveillance technology with the police.
Argent has said it used the system to "help the Metropolitan Police and British Transport Police prevent and detect crime".
However, a British Transport Police spokesman told BBC News it had never "contributed, nor has benefited, from any trial of facial recognition technology in the Kings Cross estate".
'Blurred out'
The letter goes on to reference plans for a new "upgraded system".
"[It] is designed to run in the background, effectively dormant unless it matches against a small number of 'flagged' individuals (for example, individuals who have committed an offence on the estate or high risk-missing persons).
"At this point, all other faces are automatically blurred out when the footage is played back or captured.
"The system does not does not store the facial images of others."
The letter makes clear Argent was already "in the process of installing" the upgraded system at the time of writing.
Monday's statement, however, had referred only to "work on the potential introduction of new FRT [facial recognition technology]".
The letter goes on to say Argent had been audited by an independent company to ensure it was compliant with GDPR (General Data Protection Regulation), the privacy law that came into effect last year.
And it said it intended to work with the Information Commissioner's Office before turning on its upgraded system to ensure it was fully compliant with the law.
The ICO has made it clear any software that recognises a face in a crowd and then matches it against a database of people counts as the processing of personal data, whether or not the faces are subsequently blurred out.
It launched an investigation into the use of live facial-recognition technology at King's Cross on 15 August and has yet to report its findings.
Argent has said it has "no plans to reintroduce any form" of facial-recognition technology at the site at this time.
"It is an absolute scandal that this has been going on, apparently in secret, in the centre of London for years," commented Silkie Carlo from the privacy campaign group Big Brother Watch.
"We hope the ICO takes the most robust available action."