Legal action launched over sham marriage screening algorithm


The Home Office is facing a legal challenge over an algorithm it uses to identify potential sham marriages, the BBC can reveal.

The system identifies couples suspected of marrying solely to get around immigration controls and refers them to officials for investigation.

Public Law Project (PLP), a legal charity, says it could discriminate against people from certain countries.

The Home Office declined to comment on the specifics of the challenge.

The automated system is run by a special unit within the Home Office charged with cracking down on sham marriages and civil partnerships.

The legal challenge is the latest case to highlight the increasing use of data-driven methods within government for law enforcement purposes.

PLP has sent a pre-action letter to the Home Office outlining its concerns over the algorithm, the first stage in a potential court challenge over whether it is lawful.

The department began using the algorithm in 2019. It was designed to make the investigation process more efficient and to reduce costs.

It is applied to marriage applications involving anyone who is not a British or Irish citizen and who does not have settled status or a valid visa.

The algorithm assesses couples against eight "risk factors". These include the age difference between the couple and any shared travel history, although the full list of criteria has not been disclosed.

Couples deemed to "fail" the process are flagged for further investigation. This can involve couples being interviewed by immigration officials or being asked to provide documents showing they are in a genuine relationship.
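The article gives only this much detail about how the screen works. As a purely illustrative sketch, a rule-based risk-factor screen of this general kind might look like the following Python: the two factors shown are the only ones named above, and every threshold, field name and the flag-counting scheme are hypothetical assumptions rather than the Home Office's actual, undisclosed logic.

```python
from dataclasses import dataclass

# Illustrative only: the Home Office has not disclosed its real criteria.
# Age difference and shared travel history are the two factors named in
# the article; the thresholds and scoring below are invented for this sketch.

@dataclass
class MarriageApplication:
    age_difference_years: int       # hypothetical field
    shared_travel_history: bool     # hypothetical field
    # ...the six remaining, undisclosed risk factors would sit here

def count_risk_flags(app: MarriageApplication) -> int:
    """Count how many of the sketch's risk factors an application trips."""
    flags = 0
    if app.age_difference_years > 10:   # hypothetical cut-off
        flags += 1
    if not app.shared_travel_history:   # no record of travelling together
        flags += 1
    return flags

def refer_for_investigation(app: MarriageApplication, threshold: int = 1) -> bool:
    """Deem the application to "fail" if it trips at least `threshold` factors."""
    return count_risk_flags(app) >= threshold

if __name__ == "__main__":
    couple = MarriageApplication(age_difference_years=22, shared_travel_history=False)
    print(refer_for_investigation(couple))  # True: flagged for investigation
```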

'Home Office secrecy'

According to freedom of information disclosures to PLP, the Home Office has denied the algorithm uses nationality as a factor.

An equality assessment disclosed to the charity, however, shows Bulgarian, Greek, Romanian and Albanian people are more likely to be referred for investigation.

The charity says this shows the algorithm could be indirectly discriminating against people on the basis of nationality, something that must be justified in order to be lawful.

It has also questioned whether officials at the department have proper oversight of the algorithm, including whether they can decide that an investigation should not be launched.

PLP's legal director Ariane Adam said "Home Office secrecy" meant couples were in the dark over how their marriage applications were being processed.

"Without that understanding, they are unable to seek remedy when things go wrong, such as when they are impacted by unlawful discrimination," she said.

"New technology can achieve greater efficiency, accuracy, and fairness in government decision-making.

"But if the computers merely replicate biased information, then all they are doing is making prejudicial and unfair decisions faster than humans can."

In a statement, the Home Office said it took abuse of immigration routes "very seriously".

"We will not hesitate to take enforcement action against individuals found to be in a sham marriage or civil partnership including cancelling their leave and removing them from the UK," a spokesperson added.

Algorithm concerns

The Home Office has declined requests from PLP to disclose more information about the criteria used by the algorithm, on the grounds that doing so could undermine attempts to crack down on sham marriages.

The group is challenging this decision as part of its wider campaigning for more transparency in the use of automated decision-making by public bodies.

The use of such techniques came to prominence in 2020, when an algorithm used to determine A-level and GCSE results was scrapped after a public outcry.

Research by PLP has suggested the use of automated decision-making is especially prominent in fields including welfare, immigration and housing.

In 2020, the Home Office suspended the use of an algorithm that helped decide visa applications, after a legal challenge from a different charity alleged it was biased against certain nationalities.

The Centre for Data Ethics and Innovation, a government advisory body, has previously said public bodies should be more open about how they use algorithms to make decisions.

In November 2021, it developed a standardised template that public bodies can use voluntarily to record how they use algorithms. The template is reviewed every six months.
