Kiwi researcher helps reduce racial bias in US public housing

September 21, 2022

An AUT researcher has helped counter racial bias that was privileging white US residents over others in the allocation of public housing.

A New Zealand researcher has helped develop a machine-learning system to reduce racial bias in the allocation of public housing to the homeless in the United States.

The system replaces a questionnaire of “highly personal questions”, says Professor Rhema Vaithianathan, of Auckland University of Technology.

These included questions about drug use, interpersonal violence, risk-taking, previous evictions and sexual orientation. The answers determined whether or not a person was eligible for a public house.

A study, however, found people weren’t answering truthfully.

“When you ask stigmatising questions from an already stigmatised population they are less likely to answer in a way that further stigmatises them,” says Professor Vaithianathan.

“Unfortunately, the tool wants you to say that you are a drug user because it elevates your score and makes you more likely to get a house.”

African American women, for example, were more likely than non-black males in the same age group to under-report mental-health issues and time in prison, according to research undertaken in Allegheny County, Pennsylvania, while the questionnaire system was being used.

How the machine-learning system works

With the machine-learning system, the client only needs to provide their name, date of birth and gender, Professor Vaithianathan says.

“Within seven seconds, [the system] looks up the person’s history in an integrated data warehouse,” she says.

Professor Vaithianathan says the person’s records show whether they have used public housing before, spent time in jail, or been hospitalised for mental-health issues.

“[The tool] predicts if that person will, in the next 12 months, have a mental-health crisis, stay a night in jail or have four or more emergency room visits."

If the person has no historical data in the system – which happens for only 10 per cent of clients – they are given a questionnaire to answer, but this time with less stigmatising questions.

If they get high scores due to their history, they are more likely to get a house.

“It’s based on this idea that the person who scores higher is at greater risk,” said the researcher.
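
To make that workflow concrete, here is a minimal Python sketch of the process as described above: look the client up by name, date of birth and gender; if records exist, turn their history into a 12-month risk score; otherwise fall back to a short questionnaire; then prioritise higher scores. Every detail here – the warehouse lookup, the field names, the weights and the fallback questions – is hypothetical and stands in for Allegheny County's actual system, which uses a trained machine-learning model over an integrated data warehouse.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record of administrative history pulled from the data warehouse.
@dataclass
class History:
    prior_public_housing: bool
    jail_nights_last_2y: int
    mh_hospitalisations_last_2y: int
    er_visits_last_2y: int

# Stand-in for the integrated data warehouse, keyed on name, date of birth and gender.
WAREHOUSE: dict[tuple[str, str, str], History] = {
    ("Jane Doe", "1980-01-01", "F"): History(True, 3, 1, 5),
}

def lookup_history(name: str, dob: str, gender: str) -> Optional[History]:
    """Return the client's administrative history, or None if no records exist."""
    return WAREHOUSE.get((name, dob, gender))

def predicted_risk(h: History) -> float:
    """Toy stand-in for the model's 12-month risk prediction
    (mental-health crisis, a night in jail, or four or more ER visits).
    The real system uses a trained machine-learning model, not these weights."""
    score = 0.0
    score += 0.2 if h.prior_public_housing else 0.0
    score += min(h.jail_nights_last_2y, 10) * 0.03
    score += min(h.mh_hospitalisations_last_2y, 5) * 0.08
    score += min(h.er_visits_last_2y, 10) * 0.04
    return min(score, 1.0)

def fallback_questionnaire_score(answers: dict[str, bool]) -> float:
    """Short, less stigmatising questionnaire used for the roughly 10 per cent
    of clients with no history in the warehouse (questions are illustrative)."""
    return sum(answers.values()) / max(len(answers), 1)

def priority_score(name: str, dob: str, gender: str,
                   answers: Optional[dict[str, bool]] = None) -> float:
    """Higher score = higher predicted risk = higher priority for housing."""
    history = lookup_history(name, dob, gender)
    if history is not None:
        return predicted_risk(history)
    return fallback_questionnaire_score(answers or {})

if __name__ == "__main__":
    # Client with warehouse records: scored from administrative history.
    print(priority_score("Jane Doe", "1980-01-01", "F"))
    # Client with no records: scored from the short fallback questionnaire.
    print(priority_score("New Client", "1990-05-05", "M",
                         answers={"recent_eviction": True, "unsheltered": True}))
```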

The machine-learning system has now been in use in Pennsylvania for 12 months, and follow-up analysis is showing promising results.

People at high risk now generally have greater access to public housing, irrespective of ethnicity; previously, white people with middle or high scores were housed at higher rates than black clients.

After the introduction of the system, racial inequalities disappeared.
