'Facial recognition can make mistakes, it's not a decision-maker'

Live facial recognition by police is due to start in West Yorkshire later in the year
Later this year, as you walk down the street in West Yorkshire, your face may well be checked against a criminal database.
Live facial recognition (LFR) has been used by some police forces for eight years - but new funding means it is now to be rolled out in more areas.
Last week, the Home Office confirmed that a total of 10 new LFR vans would be deployed across the country, with two of those set to be used in West Yorkshire, according to police.
The government said the technology had been used in London to make 580 arrests in 12 months, including 52 registered sex offenders who breached their conditions.
Campaign groups have said they are worried about how intrusive LFR could prove to be.
But Alison Lowe, West Yorkshire's Police and Crime Commissioner, has told the BBC that photographs and data collected by LFR will not be stored.
Ms Lowe explained: "Those photographs will be inputted into the system. It's using a live feed, and it measures against that police watch list. Other faces get pixelated out automatically.
"The technology is so sophisticated that it just has numbers - even for the people that it recognises, it's just a series of numbers. But then there's a match that will pop up for the police and they'll only see the face of the person on the watch list.
"After that piece of work is finished, when they're going home for the day, all those faces are deleted from the system."

Alison Lowe, West Yorkshire's Police and Crime Commissioner, says there has to be "a lot" of training in connection with the use of LFR
There have been concerns about false matches when LFR is used.
Shaun Thompson, 39, who was wrongly identified as a suspect by LFR last year, and who is now bringing a High Court challenge against the Metropolitan Police, describes live facial recognition as "stop and search on steroids".
Ms Lowe said she was aware that LFR technology could make mistakes and police needed to be clear in how they used it.
"It definitely causes me concern as I've worked with black and brown communities for many years and I hold myself to a very high standard in regard to cases of disproportionality," she said.
"It can make mistakes, and the whole point of live facial recognition is that it's not a decision-maker."
Ms Lowe said mistaken identity was easy to disprove once a human officer reviewed the matches flagged by LFR technology.
"There's got to be a lot of training associated with this," she said.
"We know the College of Policing have been looking at whether or not bias in relation to ethnicity, race or gender is embedded, and apparently it's neutral as to those things.
"We need to be alive to those risks. We need to be holding the police and criminal justice partners to account."

LFR cameras have been used in London by the Metropolitan Police for some time
West Yorkshire Police declined an interview request about the introduction of LFR vans in the county, but confirmed that two LFR vehicles would be brought into use by the force later this year.
In a statement, human rights organisation Liberty said of the use of LFR technology by police forces: "Any tech which has the potential to infringe on our rights in the way scanning and identifying millions of people does, needs to have safeguards around its use."
Madeleine Stone, senior advocacy officer at privacy campaign group Big Brother Watch, said LFR was being rolled out "without a proper legal basis".
"There's never been a vote, there's never been a consultation from the public or from parliament," she said.
"So the fact that the government is investing millions in taxpayers' money into this Orwellian and undemocratic technology is a real misstep and we're really, really concerned about it."
Home Secretary and West Yorkshire Labour MP Yvette Cooper has pledged that people's data will remain secure if their images are caught on LFR.
Ms Cooper said: "The overwhelming majority of images are deleted within 0.2 seconds, so for those who are not on the wanted list of serious criminals, those images are not held. They are immediately deleted.
"There do have to be safeguards. We will do a new legal framework. I think this is technology that we do need to make sure is properly used."