King's Cross developer defends use of facial recognition
The developer behind a 67-acre site in the King's Cross area of central London has defended its use of facial recognition technology.
Under data protection laws, firms must provide clear evidence that there is a need to record and use people's images.
A spokeswoman said the tool was used to "ensure public safety" and was one of "a number of detection and tracking methods".
The local council said it was unaware that the system was in place.
It was first reported by the Financial Times.
In a statement, developer Argent said it used cameras "in the interest of public safety" and likened the area to other public spaces.
"These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public," it said.
A spokeswoman declined to say what those systems were, how long the facial recognition had been in operation or what the legal basis was for its use, as is required under European data protection law.
Potential for inappropriate use
In addition to its National Rail, London Underground and Eurostar stations, the King's Cross area is home to a number of restaurants, shops and cafes, as well as Google offices and Central Saint Martins college.
The college told the BBC it had "not been made specifically aware" that the tech was in use in the area and added that it does not use it inside its own buildings.
According to the King's Cross website, planning permission for new additions to the site, granted in 2006, included:
50 buildings
1,900 homes
20 streets
10 public parks
The BBC has confirmed that London's Canary Wharf is also seeking to trial facial recognition tools, as reported in the Financial Times.
The Information Commissioner's Office (ICO) said it had general concerns about the potential for inappropriate use of the technology.
"Organisations wishing to automatically capture and use images of individuals going about their business in public spaces need to provide clear evidence to demonstrate it is strictly necessary and proportionate for the circumstances, and that there is a legal basis for that use," it said in a statement.
"The ICO is currently looking at the use of facial recognition technology by law enforcement in public spaces and by private sector organisations, including where they are partnering with police forces.
"We'll consider taking action where we find non-compliance with the law."
South Wales Police faced a legal challenge to its use of facial recognition in 2018.
Despite this, the force is currently running a three-month trial of a new app.
Sajid Javid, now the chancellor, gave his backing to police trials of facial recognition cameras last month, while he was home secretary.
However, privacy groups have voiced concerns about the technology's implications for privacy rights.
"Facial recognition is nothing like CCTV - it's not an accurate comparison," said Stephanie Hare, an independent researcher and tech commentator.
"It allows us to be identified and tracked in real time, without our knowledge or our informed consent.
"We recognise the power of DNA and fingerprints as biometrics and their use is governed very strictly under UK law. We do not apply the same protections and restrictions to face, yet it is arguably even more powerful precisely because it can be taken without our knowledge."