Calls for legal review of UK welfare screening system that factors in claimants’ age

An automated system that screens welfare claimants for signs of fraud or error bases its verdicts partly on claimants’ age, it has emerged, prompting calls for a review of the system’s legality.

Xantura, a UK tech company that provides “risk-based verification” to around 80 councils and has assessed hundreds of thousands of claimants, previously said it had not fed its algorithm any information protected by anti-discrimination law.

However, its managing director, Wajid Shafiq, has now admitted that the system weighs a claimant’s age, a characteristic protected by the Equality Act 2010. Treating a person less favorably on the basis of such a characteristic amounts to direct discrimination and may be unlawful.

Xantura spoke to the Guardian after the civil liberties campaign group Big Brother Watch obtained a cache of documents through Freedom of Information Act requests, giving insight into how Xantura’s system works.

It automatically recommends that riskier applicants for housing benefit and council tax support go through tighter checks, which can lead to delayed decisions. It also fast-tracks applications from supposedly low-risk claimants.

Xantura insists that using age helps reduce fraud and error, speeds up the majority of claims and does not breach equality law, citing a legal exception that allows providers of financial services to take age into account. But campaigners are calling for further scrutiny.

Documents released to BBW showed that Xantura processed data on where people live – including the ethnic mix of their neighborhood – as well as their gender and marital status.

Gender and race are also protected characteristics, but Shafiq said that, apart from age, the system does not use “any other of the protected characteristics”. He said the neighborhood and gender information was used only to check, after decisions were made, whether the system was operating in a biased way.

He declined to confirm what other personal information is fed into the algorithm, saying that doing so could allow claimants to game the system, but said that information provided by claimants could be used to prevent fraud and error.

When asked whether the algorithm predicted that older or younger people were more likely to commit fraud or make mistakes, he replied: “[It is] not that easy. This is a multivariate model, so various combinations of risk factors must exist to generate a fraud or error flag.”
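To illustrate what a “multivariate model” of this kind can look like, here is a minimal, hypothetical sketch in Python. Xantura has not published its features, weights or thresholds, so everything below – the Claim fields, the WEIGHTS, the 0.3/0.7 routing cut-offs – is invented for illustration only. The sketch simply shows how several weighted inputs, including age, could be combined into a single risk score that routes a claim to fast-track, standard or enhanced checks.

```python
# Hypothetical sketch of multivariate "risk-based verification" routing.
# This is NOT Xantura's model: its real inputs, weights and thresholds
# are not public. All features and numbers here are invented.

from dataclasses import dataclass
import math


@dataclass
class Claim:
    age: int                 # protected characteristic (Equality Act 2010)
    years_at_address: float  # invented example feature
    prior_claims: int        # invented example feature


# Invented weights: in a real multivariate model these would be fitted
# to historical fraud/error outcomes, not hand-chosen.
WEIGHTS = {"age": -0.02, "years_at_address": -0.10, "prior_claims": 0.40}
BIAS = -1.0


def risk_score(claim: Claim) -> float:
    """Logistic combination of weighted features: no single factor
    decides the outcome, which is what makes the model multivariate."""
    z = (BIAS
         + WEIGHTS["age"] * claim.age
         + WEIGHTS["years_at_address"] * claim.years_at_address
         + WEIGHTS["prior_claims"] * claim.prior_claims)
    return 1.0 / (1.0 + math.exp(-z))  # probability-like score in (0, 1)


def route(claim: Claim) -> str:
    """Map the score to a verification track, as RBV systems do:
    high risk -> tighter checks, low risk -> fast track."""
    score = risk_score(claim)
    if score >= 0.7:
        return "enhanced checks (decision may be delayed)"
    if score <= 0.3:
        return "fast track"
    return "standard checks"


if __name__ == "__main__":
    print(route(Claim(age=24, years_at_address=0.5, prior_claims=6)))
    print(route(Claim(age=58, years_at_address=12.0, prior_claims=0)))
```

Even in a toy model like this, any non-zero weight on age means two otherwise identical claimants of different ages can receive different scores and be routed differently – which is precisely the discrimination concern campaigners raise.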

He previously said: “No protected features are used in the RBV model.”

Xantura is one of many companies helping to automate the benefits system, but how these “welfare robots” work has been kept under wraps. Claimants are not told that their applications are subject to algorithmic decision-making, and concern about the impact is growing.

According to documents released to BBW, Xantura said in a confidential 2012 “model snapshot” that variables “deemed statistically significant” included the type of area a person lives in, as defined in broad categories that partly reflect ethnic makeup. At the time, these groupings, defined by the Office for National Statistics, included “ethnicity central”, describing places that generally have more non-white residents than the UK average, “especially people of mixed or black ethnicity”.

A 2017 RBV ‘user guide’ published by Salford City Council and written by Xantura’s business partner Northgate listed 66 items of ‘specific data requested by Xantura’ to calculate a risk score, including gender, age and disability.

Shafiq said there was “an error in the way the documents were drafted” and that not all of these factors were used to determine the risk posed by an applicant.

“There is a difference between the RBV system and the RBV model, and this distinction needs to be made more clearly in these documents. There was a misunderstanding.”

Some of Xantura’s local authority clients have said in public documents that they do not believe the system has an ‘equality impact’ because it does not use protected characteristics such as age, gender, race and disability. Xantura provides client councils with a draft template for implementing RBV policies, including performance reporting and approval.

“In our experience, our clients use our draft policies in developing their own policies,” said Shafiq.

“There is a duty to prevent fraud and error,” he said. “If local authorities decide that we shouldn’t use age in the modeling process, we can remove it.”

Jake Hurfurt, head of research and investigations at BBW, said: “Dozens of councils have rolled out RBV policies without considering the real risk of discrimination that algorithms pose, and no one has a clue of the damage these algorithms could cause because bias and disproportionality are not monitored.”

Andy Mitchell, a benefits claimant who helps others with their claims, said: “All of the groups that are usually targeted are again affected by these algorithms – the poorest in society, the people with no voice.”

Robin Allen QC, a discrimination lawyer who heads the AI Law Consultancy, said: “It is unlikely to be lawful to use the protected characteristic of age to suggest who might cheat the benefits system. Age is not a good indicator of honesty and should not normally be used as such.”

Shafiq defended the system, saying: “It is entirely appropriate that we use age, and it is entirely appropriate that we also use other fields, because all the data that [a claimant] supplies can be used for fraud and error prevention.”

Northgate is part of the Japanese tech giant NEC, and Xantura’s product plugs into its revenues and benefits system. NEC Software Solutions said: “We have no involvement in defining these criteria.”

