Study Finds Racial Bias Against Black Patients in Widely Used US Healthcare Algorithm

According to Nature News, a sweeping analysis has found that an algorithm widely used in US hospitals to allocate healthcare to patients has been systematically discriminating against black people.

What We Know:

  • An algorithm widely used in US hospitals to allocate health care has been systematically discriminating against black patients.
  • The study, published on October 24, concluded that the algorithm was less likely to refer black people than equally sick white people to programs that aim to improve care for patients with complex medical needs.
  • Ziad Obermeyer, who studies machine learning and health care at the University of California, Berkeley, said that he and his team stumbled onto the problem while examining the impact of such programs.
  • Obermeyer said they were surprised to find that people who identified as black were generally assigned lower risk scores than equally sick white people, and as a result were less likely to be referred to the programs that provide more personalized care.
  • For the 43,539 white patients and 6,079 black patients enrolled at the hospital, the researchers obtained the algorithm's predicted risk scores and found that, at a given risk score, black patients had significantly poorer health than their white counterparts.
  • “The algorithms are being trained to find the sickest, in the sense of those whom we spend the most money on. And there are systemic racial differences in health care in who we spend money on,” Obermeyer told UChicago News.
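The quote above points at the core mechanism: a risk score trained on past spending, rather than on actual health, inherits any group-level spending gap. The sketch below illustrates that mechanism with entirely hypothetical numbers (the scoring function, cost figures, and referral threshold are all made up for illustration; they are not the real algorithm from the study).

```python
# Hypothetical sketch of cost-as-proxy bias: two equally sick patients
# receive different risk scores because the score reflects past spending,
# not health need. All numbers here are invented for illustration.

def cost_proxy_risk_score(annual_cost, max_cost=10_000):
    """Risk score in [0, 100] derived only from past annual spending."""
    return min(annual_cost / max_cost, 1.0) * 100

# Equal sickness (same count of chronic conditions), but systemically
# unequal access means less was spent on the black patient's care.
white_patient = {"chronic_conditions": 4, "annual_cost": 8_000}
black_patient = {"chronic_conditions": 4, "annual_cost": 5_000}

REFERRAL_THRESHOLD = 60  # refer to the high-risk care program above this

for label, p in [("white", white_patient), ("black", black_patient)]:
    score = cost_proxy_risk_score(p["annual_cost"])
    status = "referred" if score > REFERRAL_THRESHOLD else "not referred"
    print(f"{label}: score {score:.0f}, {status}")
```

Despite identical health need, only the higher-spending patient clears the referral threshold, which is the shape of the disparity the study reports.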

Even though hospitals have tested the algorithm and are aware of its bias, many still rely on its scores to make care decisions every day.