Police warned about using algorithms to decide who's locked up


Police should not keep suspects locked up because a computer program has told them they are likely to be offenders, a human rights group has told MPs.

Algorithms that predict whether someone is likely to offend, based on past behaviour, gender and where they live, could be "discriminatory", Liberty said.

The human rights group was giving evidence to the Commons science and technology committee.

The MPs are investigating the growing use of algorithms in decision making.

They are concerned businesses and public bodies are relying on computer programs to make life-changing decisions - despite the potential scope for errors and misunderstandings.

Durham Police have already launched a system which uses algorithms to help decide whether to keep a suspect in custody.

The Harm Assessment Risk Tool (HART) uses historical data on offending to classify suspects as low, medium or high risk of offending.

The tool uses information such as offending history, the type of crime a suspect has been accused of, their postcode and gender.
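For illustration, a risk tool of this kind can be sketched as an ordinary machine-learning classifier trained on past custody records. The toy Python example below shows the general shape of such a system; the feature names, data and choice of model are assumptions made for illustration, not Durham Police's actual HART implementation.

```python
# Illustrative sketch only: a toy three-band risk classifier built on the kinds
# of features the article describes. The data, feature names and model choice
# are assumptions, not Durham Police's actual system.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical historical custody records (a real system would use years of force data).
records = pd.DataFrame({
    "prior_offences": [0, 5, 1, 12, 3, 0, 7, 2],
    "offence_type":   ["theft", "assault", "theft", "burglary",
                       "assault", "fraud", "burglary", "theft"],
    "postcode_area":  ["DH1", "DH7", "DL4", "DH1", "DL4", "DH7", "DH1", "DL4"],
    "gender":         ["M", "M", "F", "M", "F", "M", "M", "F"],
    "risk_band":      ["low", "high", "low", "high",
                       "medium", "low", "high", "medium"],  # observed outcome
})

X = pd.get_dummies(records.drop(columns="risk_band"))  # one-hot encode categorical features
y = records["risk_band"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# The output is advisory: a predicted band plus class probabilities,
# which a custody officer could override.
print(model.predict(X_test))
print(model.predict_proba(X_test))
```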

But Silkie Carlo, Senior Advocacy Officer for Liberty, warned such systems should be seen as "at best advisory".

She pointed to evidence from the US which suggested algorithms in the criminal justice system were more likely to incorrectly judge black defendants as having a higher risk of reoffending than white defendants.

Durham Police stress that the algorithm's output is "advisory" and officers can use their discretion.

But Professor Louise Amoore of Durham University warned it can be "difficult" for a human "to make a decision against the grain of the algorithm".

Ms Carlo said Liberty had requested data from Durham Police about how common the use of human discretion was in decisions using algorithms, but the request was rejected.

Concerns were also raised about the lack of diversity in the technology sector contributing to unintentional bias in algorithms.

'Accurate 98% of the time'

Sandra Wachter, a lawyer and researcher in data ethics, AI and robotics at the Oxford Internet Institute, said: "The coding community needs to be more diverse. If we only have white, male coders, of course the systems are going to be biased."

The potential for algorithms to improve efficiency and aid decision-making was also recognised.

Professor Amoore said: "Algorithms allow you to do more with less. So if it's about policing decisions it could be how do you allocate a much more scarce policing resource to the front line."

Data for the Harm Assessment Risk Tool was taken from Durham Police records between 2008 and 2012.

The system was then tested during 2013, and the results - showing whether suspects did in fact offend or not - were monitored over the following two years.

Forecasts that a suspect was low risk turned out to be accurate 98% of the time, while forecasts that they were high risk were accurate 88% of the time.
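Those figures describe how often each type of forecast turned out to be right once the follow-up period was over. The short sketch below shows how such per-band accuracy could be calculated; the counts are made up to match the reported percentages and are not the study's actual data.

```python
# Sketch of measuring forecast accuracy per predicted band against what
# suspects went on to do. The counts are illustrative, not the study's data.

# (predicted_band, actually_offended) pairs from a hypothetical follow-up period.
follow_up = [("low", False)] * 49 + [("low", True)] * 1 \
          + [("high", True)] * 44 + [("high", False)] * 6

def band_accuracy(band: str, offended_if_correct: bool) -> float:
    """Share of forecasts for `band` that matched the observed outcome."""
    cases = [offended for b, offended in follow_up if b == band]
    return sum(o == offended_if_correct for o in cases) / len(cases)

print(f"low-risk forecasts correct:  {band_accuracy('low', False):.0%}")   # 98%
print(f"high-risk forecasts correct: {band_accuracy('high', True):.0%}")   # 88%
```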

This reflects the tool's built-in predisposition - it is designed to be more likely to classify someone as medium or high risk, in order to err on the side of caution and avoid releasing suspects who may commit a crime.
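One simple way to build in that kind of caution is to demand much stronger confidence before labelling a suspect low risk than high risk. The thresholds and probabilities in the sketch below are assumptions chosen to illustrate the idea, not the tool's real calibration.

```python
# Illustrative sketch of "err on the side of caution": the thresholds below are
# assumptions for illustration, not the tool's real calibration.
def cautious_band(p_low: float, p_medium: float, p_high: float) -> str:
    """Map class probabilities to a risk band, biased against 'low'."""
    if p_high >= 0.3:   # even a modest chance of high risk wins
        return "high"
    if p_low >= 0.8:    # 'low' requires strong confidence
        return "low"
    return "medium"

# Example: a suspect the model thinks is probably low risk, but not
# confidently enough, gets bumped up to medium.
print(cautious_band(p_low=0.7, p_medium=0.2, p_high=0.1))  # -> "medium"
```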