- New York State is investigating UnitedHealth Group over its use of an algorithm that researchers found to be racially biased.
- A study in the journal Science found that a widely used algorithm steered white patients toward complex treatment over sicker black patients.
- The study’s findings point to one of the many risks of implementing more AI in healthcare. Business Insider Intelligence projects that spending on healthcare AI will grow at an annualized 48% between 2017 and 2023.
- Algorithms can also strip doctors and nurses of the autonomy to diagnose and treat patients individually.
- Visit Business Insider’s homepage for more stories.
UnitedHealth Group used technology that may have kept sick black patients from receiving high-quality care.
New York’s state departments of financial services and health sent a letter to UnitedHealth Group over its use of an algorithm that researchers found to be racially biased. Per the Wall Street Journal, the missive is an initial step into a larger investigation.
The algorithm in question, Impact Pro, identifies which patients would benefit from complex health procedures. It favored white patients over sicker black patients between 2013 and 2015, according to a study published in the prestigious journal Science.
New York regulators deemed the use of this discriminatory technology “unlawful,” and asked UnitedHealth to either demonstrate that the algorithm is not biased or to stop using Impact Pro immediately.
“New York will not allow racial bias, especially where it results in discriminatory effects that could mean the difference between life and death for an individual patient and the overall health of an already-underserved community,” Linda Lacewell, superintendent of New York’s department of financial services, and Howard Zucker, commissioner of the department of health, wrote in the letter.
Why Impact Pro’s algorithm may have discriminated against black patients
The algorithm predicted black patients would cost less, which signaled to medical providers that their illnesses must not be that serious. But, in reality, black patients cost less because, on average, they use fewer healthcare services than white patients.
The study stated that black patients often don’t seek out healthcare due to a lack of access and a general mistrust of the system. These barriers to accessing care, in turn, indirectly drive down the projected “cost” of illness in black patients.
Health systems use this algorithm on 200 million people each year across the US, the report states. If the racial bias were eliminated, the share of black patients flagged for additional help would increase from 17.7% to 46.5%, the study predicts.
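The mechanism the researchers describe can be shown with a toy sketch. This is a hypothetical illustration, not Impact Pro’s actual code: the patient data, condition counts, and spending figures below are invented to show how ranking patients by projected cost, rather than by health need, can deprioritize a sicker patient who simply spends less.

```python
# Hypothetical sketch of cost-as-proxy bias (invented toy data, not Impact Pro).
# Each patient: (id, chronic_conditions, annual_spend_usd).
patients = [
    ("A", 2, 9000),  # less sick, higher healthcare spending
    ("B", 4, 6000),  # sicker, but lower spending (e.g. barriers to access)
]

# Cost-proxy ranking: highest projected spend is treated as highest need.
by_cost = sorted(patients, key=lambda p: p[2], reverse=True)

# Need-based ranking: most chronic conditions first.
by_need = sorted(patients, key=lambda p: p[1], reverse=True)

print([p[0] for p in by_cost])  # ['A', 'B'] -- healthier patient prioritized
print([p[0] for p in by_need])  # ['B', 'A'] -- sicker patient prioritized
```

The two orderings disagree: under the cost proxy, the patient facing barriers to care drops to the bottom of the priority list even though they are sicker, which is the pattern the Science study measured at scale.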
Optum, UnitedHealth’s multi-billion dollar business that used Impact Pro, previously touted its use of AI to provide better care.
Business Insider reached out to UnitedHealth Group for comment.
The trouble with algorithmic healthcare
AI and algorithms are on the rise in the health industry. Business Insider Intelligence projects that spending on healthcare AI will grow at an annualized 48% between 2017 and 2023.
Yet experts and researchers have long called out the bias algorithms can perpetuate. Amazon built a hiring tool that discriminated against women. Tweets from black people were more likely to be dubbed “toxic” in a Google-funded AI tool. Facial recognition tools used by the US government have been shown to misidentify black people much more often than white people.
An algorithm is also at risk of reflecting structural inequality, as with zip codes. In the 1930s, the US explicitly segregated African-American neighborhoods from white ones through policies known as “redlining.” Those policies still shape the racial makeup of zip codes today — which means feeding zip codes into algorithms perpetuates racial inequality.
Gerard Brogan, a registered nurse and the director of nursing practice at National Nurses United and its California branch, says algorithms take autonomy from clinicians. While algorithms average patient outcomes to determine treatment, most nurses and doctors prefer to provide treatment tailored to each individual.
“Traditionally, both nurses and doctors are independent professionals, but because it’s now an industry, we’re looking at care where algorithms are dictating care rather than professional judgment,” Brogan said. “Bill Gates a few years ago said in 15 years’ time there will be nurses, there will be no doctors, because no one can out-think a computer. Algorithms may beat people at chess, but they don’t hold people’s hands.”