Police facial recognition system faces legal challenge

[Image: human heads on a measurement grid]

A legal challenge against the use of automatic facial recognition technology by police has been launched by a civil liberties group.

Automatic facial recognition (AFR) uses CCTV or surveillance cameras to record facial characteristics and compare them with images on police databases.

Lawyers for Big Brother Watch argue the use of AFR breaches the rights of individuals under the Human Rights Act.

The Metropolitan Police says the technology will help keep London safe.

The system is being piloted in London, with three other forces - Humberside, South Wales, and Leicestershire - also trialling the technology.

'Very intrusive'

However, it has proved controversial, with one watchdog describing its use in public places as "very intrusive".

Court documents, seen by the BBC, also claim the Home Office has failed in its duty to properly regulate AFR's use.

Manufacturers of the systems say they can monitor multiple cameras in real time, "matching" thousands of faces a minute against images already held by the police - often mugshots of suspects taken into custody.
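
In essence, the vendors are describing a watchlist comparison: each face picked out of the video feed is checked against a set of reference images. A minimal sketch of that matching step is below, assuming faces have already been reduced to fixed-length numeric "embedding" vectors by an upstream detection model (not shown). All names and the 0.6 threshold are hypothetical illustrations, not any vendor's actual system.

```python
# Illustrative sketch only: the watchlist-matching step described above,
# assuming an upstream model has already converted each face to an
# embedding vector. Names and threshold are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.6) -> list[str]:
    """Return IDs of watchlist entries scoring above the threshold.

    Every returned ID is an "alert". As the trial figures in this article
    show, a loose threshold can make most alerts false positives.
    """
    return [person_id for person_id, ref in watchlist.items()
            if cosine_similarity(face, ref) >= threshold]

# Example: one random 128-dimensional embedding checked against two records.
rng = np.random.default_rng(0)
watchlist = {"suspect_A": rng.normal(size=128),
             "suspect_B": rng.normal(size=128)}
live_face = rng.normal(size=128)
print(match_against_watchlist(live_face, watchlist))
```

The key design point is the threshold: set it low and the system raises many alerts, most of them wrong; set it high and genuine matches are missed.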

However, Big Brother Watch says the Met's own research, published in May, shows that during trials only two genuine matches were made out of 104 system "alerts" - a false alert rate of roughly 98%.

The group also takes issue with the length of time the images gathered by AFR are held.

The Met piloted the system at Notting Hill Carnival in 2016 and 2017, at the Cenotaph on Remembrance Sunday, and at Westfield Shopping Centre in Stratford last month.

Further trials are planned. The force says the technology is "an extremely valuable tool".

Meanwhile, in South Wales, police used AFR at least 18 times between May 2017 and March 2018, according to court documents.

Cameras in Cardiff city centre and at a demonstration outside an "arms fair" were used to gather images of members of the public. As of April this year, AFR had generated 2,451 alerts, with only 234 proving accurate.
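
Expressed as rates, the figures quoted for the two forces work out as follows (a back-of-envelope calculation, purely illustrative):

```python
# Quick arithmetic on the trial figures quoted in this article.
trials = [("Met", 104, 2),            # alerts, genuine matches (May research)
          ("South Wales", 2451, 234)] # alerts, genuine matches (as of April)

for force, alerts, genuine in trials:
    precision = genuine / alerts
    print(f"{force}: {precision:.1%} of alerts were genuine "
          f"({1 - precision:.1%} false)")
# Met: 1.9% of alerts were genuine (98.1% false)
# South Wales: 9.5% of alerts were genuine (90.5% false)
```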

Police officers stopped 31 people who had been incorrectly identified and asked them to prove their identity.

'Ethical scrutiny'

Big Brother Watch's lawyers argue that AFR breaches individuals' rights under the Human Rights Act, including the right to privacy and freedom of expression.

Silkie Carlo, director of the civil liberties group, said: "When the police use facial recognition surveillance they subject thousands of people in the area to highly sensitive identity checks without consent."

"We're hoping the court will intervene, so the lawless use of facial recognition can be stopped. It is crucial our public freedoms are protected," she added.

However, it is likely that the Met and other police forces will welcome the opportunity to argue the case for AFR - and begin to put it on a solid legal footing alongside other unique characteristics stored on databases, such as fingerprints and DNA.

It comes after the body that advises London Mayor Sadiq Khan on policing and ethics last week called on the Met to be more open about the use of AFR - and to set out where and when it will be used before undertaking any further pilots.

Dr Suzanne Shale, who chairs the London Policing Ethics Panel, said: "We have made a series of key recommendations, which we think should be addressed before any further trials are carried out.

"We believe it is important facial recognition technology remains the subject of ethical scrutiny."