Council algorithms mass profile millions, campaigners say
Dozens of councils are using privately developed software to "mass profile" benefit claimants, privacy campaigners say.
The algorithms, designed to predict fraud and rent arrears, "treat the poor with suspicion and prejudice", according to Big Brother Watch.
Councils say good use of data can help them target services more efficiently.
But Big Brother Watch has complained to the Information Commissioner about a "Wild West of algorithms".
Using Freedom of Information law, the group requested data on the use of automated tools from more than 400 local authorities and council-owned housing associations in England, Wales and Scotland.
Just over 300 replied. Of these, 86 (about one in four) said they had used algorithms to assess benefit claims within the past three years, and 55 said they still did.
But "automation and algorithms are not all they claim to be", Big Brother Watch says.
Some apparently trivial data is given unwarranted significance, while flawed algorithms can introduce bias and discrimination and may breach data-protection laws, it says.
"Worse, the influence of private tech firms and poor transparency means that risks to people's data rights go unchallenged," the group says.
It says the algorithms have been used to:
- secretly assign fraud-risk scores to 540,000 people before they can access housing benefit or council-tax support
- process the personal data of 1.6 million people in social housing, to predict rent arrears
- predict the likelihood of homelessness, joblessness or abuse for more than 250,000 people
Individuals the software defines as high risk (20%) or medium risk (25%) are subjected to extra questioning while their benefit claims are considered, Big Brother Watch says.
'Very unjust'
One woman, who asked not to be named, sent a formal request for her data to her council and was "stunned" to find she had been flagged as medium risk for fraud.
She had been unaware of the risk-scoring process or how her personal data was being used.
"I've been made to go through all my bank statements line by line with an assessor, which made me feel like a criminal," she said.
"Now, I wonder if it's because a machine decided, for reasons unknown, I could be a fraudster.
"It feels very unjust for people like me, in genuine need, to know I'm being scrutinised and not believed."
Fewer councils are now using the algorithms than four years ago, Big Brother Watch says.
And with the rollout of universal credit, automated scrutiny is becoming more centralised.
Data on the 400,000 housing-benefit claimants given the highest fraud-risk scores by locally run systems must be sent to the Department for Work and Pensions for additional computer analysis.
And this includes:
- age
- gender
- amount of capital
- number of children
"The scale of the profiling, mass-data gathering and digital surveillance that millions of people are unwittingly subjected to is truly shocking," Jake Hurfurt, Big Brother Watch's head of research and investigations, said.
"We are deeply concerned that these risk-scoring algorithms could be disadvantaging and discriminating against Britain's poor."
Big Brother Watch is asking the Information Commissioner for better regulation and transparency, including a public register of algorithms used for decision-making in the public sector and a requirement that authorities conduct privacy and equality assessments before using such software.
The group is also asking benefit recipients to join its campaign by requesting their risk scores.
The Local Government Association, which represents councils in England, said: "Good use of data can be hugely beneficial in helping allocate resources to where they will have the biggest impact and provide insight into the causes of, and solutions to, costly social problems.
"This can, for example, include monitoring and analysing indices of deprivation, census data, take-up of specific services and resident feedback.
"It is important to note that data is only ever used to inform decisions and not to make decisions for councils."
COSLA, which represents councils in Scotland, added: "We need to use the data we have to make sure benefits have the greatest impact in tackling poverty and need, while reducing the small proportion of fraud and error that exists within the system.
"Councils are aware of and comply with data confidentiality law."