
Racial bias found in widely used health care algorithm

An estimated 200 million people are affected each year by similar tools that are used in hospital networks

A widely used health care algorithm that helps determine which patients need additional attention was found to have a significant racial bias, favoring white patients over black ones who were sicker and had more chronic health conditions, according to research published last week in the journal Science.

The bias was detected in the health services company Optum’s algorithm, but researchers say it is only one of many data-driven tools that perpetuate disparities in medical treatment. An estimated 200 million people are affected each year by similar tools that are used in hospital networks, government agencies and health care systems nationwide, the study noted.

“The risk is that biased algorithms end up perpetuating all the biases that we currently have in our health care systems,” said Ziad Obermeyer, an acting associate professor at the Berkeley School of Public Health who was the lead researcher on the study. “It furthers the vicious cycles that we all want to break.”


The algorithm used health costs to predict and rank which patients would benefit the most from additional care designed to help them stay on medications or out of the hospital. The study looked at more than 6,000 patients who self-identified as black and nearly 44,000 who self-identified as white.

Patients whose risk scores fell above the 97th percentile were marked as high-risk and automatically enrolled in the program, yet black patients at that threshold had 26.3 percent more chronic health conditions than equally ranked white patients.
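
In code, the general pattern the study describes might look like the sketch below. This is a minimal illustration, not Optum's actual system; the data file, column names and model choice are all assumptions.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

claims = pd.read_csv("claims_history.csv")          # hypothetical claims data, assumed numeric/encoded
features = claims.drop(columns=["next_year_cost"])  # demographics, diagnoses, prior utilization
target = claims["next_year_cost"]                   # next year's spending in dollars: the proxy label

# Train a model to predict future cost, then treat its prediction as "risk."
model = GradientBoostingRegressor().fit(features, target)
risk_score = model.predict(features)

cutoff = np.percentile(risk_score, 97)              # the 97th-percentile threshold described above
auto_enrolled = claims[risk_score >= cutoff]        # patients flagged for the extra-care program
```

Because the label being predicted is spending rather than sickness, any group that spends less for the same level of illness will systematically receive lower scores.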

“We already know that the health care system disproportionately mismanages and mistreats black patients and other people of color,” said Ashish Jha, director of the Harvard Global Health Institute. “If you build those biases into the algorithms and don’t deal with it, you’re going to make those biases more pervasive and more systematic and people won’t even know where they are coming from.”

Optum’s algorithm harbored this undetected bias despite its intentional exclusion of race. This is because inequity is baked into algorithms when they’re built on biased data, Jha said.

Cost is not a “race-blind” metric, and using it to screen for high-risk patients produced the disparity the researchers found in Optum’s algorithm, in part because black patients access health care less than white, wealthier patients do, Obermeyer said. Black patients spent $1,800 less per year on medical care than white patients with the same chronic conditions, leading the algorithm to conclude, incorrectly, that the black patients must be healthier because they spend less on health care.
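
The audit that surfaces this kind of proxy bias is conceptually simple: hold the algorithm's score fixed and compare how sick patients of each race actually are. The sketch below illustrates that comparison with hypothetical column names, not the study's data.

```python
import pandas as pd

# Hypothetical columns: risk_score, race, n_chronic_conditions
df = pd.read_csv("scored_patients.csv")

# Bucket patients by risk-score percentile so comparisons are made "at the same score."
df["score_percentile"] = pd.qcut(df["risk_score"], 100, labels=False)

health_by_race = (
    df.groupby(["score_percentile", "race"])["n_chronic_conditions"]
      .mean()
      .unstack("race")
)

# If cost were an unbiased proxy for health, these columns would roughly match;
# the study instead found more chronic illness among black patients at equal scores.
relative_gap = (health_by_race["black"] - health_by_race["white"]) / health_by_race["white"]
print(relative_gap.describe())
```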

Optum, based in Eden Prairie, Minnesota, said in a statement that it appreciated “the researchers’ work, including their validation that the cost model within Impact Pro was highly predictive of cost, which is what it was designed to do.”

But Obermeyer said that “simply because you left the race variable out of the model does not guarantee by any means that your algorithm will not be racist.”

“When we’re making these algorithms, we make these choices that seem technical and small, but they have deeply important impacts — both positive and negative — on people’s lives,” he said.

The causes of this cost disparity are convoluted and varied, Obermeyer said. But at their root are the disproportionate levels of poverty that black families and individuals face, he said.

From lack of access to transportation to competing demands at jobs, poverty produces a variety of conditions that make black people less likely to access health care, Obermeyer said.

“You would hope that people would recognize that there are a lot of factors that would keep different populations from either utilizing care or being able to access care, and build that into the system,” said Caitlin Donovan, spokesperson for the National Patient Advocate Foundation.

Once black patients do access care, their treatment can be affected by overt or subconscious discrimination, Obermeyer said. Some physicians have negative perceptions of black patients in terms of intelligence, pain tolerance and behavioral tendencies, according to research.

Black patients are prescribed less pain medication than white patients with the same complaints and receive fewer referrals for cardiovascular procedures. Black women are also three to four times more likely than white women to die from pregnancy-related causes.

“The system leads to differential outcomes, and we’re all responsible for that,” Jha said. “People need to understand this for what it is, which is systemic bias we need to root out.”

But Obermeyer is optimistic about the future of data-driven health care. Research like his can help root out and eliminate bias in medical algorithms, something Optum has already begun to do.

The health services company worked with Obermeyer to replicate the study on a data set of 3.7 million people. A new algorithm, which uses a prediction of health in conjunction with cost, reduced the bias by 84 percent. With that fix, the share of black patients flagged by the algorithm would rise from 17.5 percent to 46.5 percent.
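
The article describes the fix only at a high level: predicting health alongside cost. The sketch below shows one way such a blended label could be built; the column names and the 50/50 weighting are assumptions, not Optum's retrained model.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

claims = pd.read_csv("claims_history.csv")   # hypothetical data set
features = claims.drop(columns=["next_year_cost", "next_year_chronic_conditions"])

# Blend a direct measure of health with cost so the score tracks sickness,
# not just spending. The equal weighting here is purely illustrative.
label = (0.5 * claims["next_year_chronic_conditions"].rank(pct=True)
         + 0.5 * claims["next_year_cost"].rank(pct=True))

fixed_model = GradientBoostingRegressor().fit(features, label)
```

The same percentile-gap audit shown earlier can then be rerun on the new scores to measure how much of the bias the relabeling removes.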

“The tool applies complementary analytics from over 600 clinical measures to identify gaps in care based on well-established, evidence-based clinical guidelines,” Optum said. “These gaps, often caused by social determinants of care and other socio-economic factors, can then be addressed by the health systems and doctors to ensure people, especially in underserved populations, get effective, individualized care.”

While human bias can be challenging to quantify and diminish, the bias in algorithms is far easier to eradicate, Jha noted. He believes that with the right application, algorithms could even lessen the impact of discrimination that has long plagued the medical field.

“Algorithms that are built well with these issues taken into account can help doctors overcome subtle unconscious biases they might have,” Jha said. “Data and algorithms have a lot of potential to do good, but what this study reminds us of is that if you don’t do it right, you have a lot of potential to do harm.”

“Once you understand the bias in the algorithm, not only do you understand the bias in the humans that shaped the algorithms, but you also have a roadmap to fixing it,” Obermeyer said.