The Perils of Biased Data
Published article in MedTech Intelligence
By Mike Noshay
Data is one of the most essential resources in health care. It informs and educates care providers so they can make effective, positive decisions that benefit their patients. But can healthcare data be biased?
According to “Quick Safety 23: Implicit bias in health care”, published by the Joint Commission, “There is extensive evidence and research that finds unconscious biases can lead to differential treatment of patients by race, gender, weight, age, language, income and insurance status.” These biases have produced staggering inequities and disparities in healthcare among different racial and ethnic groups.
Whether intentional or unintentional, inherent bias is often built into the ways data is collected, analyzed, interpreted and distributed. In the abstract of a study published in Science magazine, researchers found that “Health systems rely on commercial prediction algorithms to identify and help patients with complex health needs. We show that a widely used algorithm, typical of this industry-wide approach and affecting millions of patients, exhibits significant racial bias.” This biased data then becomes a barrier to actions, strategies and measurable progress toward health equity.
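To make the mechanism concrete, consider a minimal, purely illustrative sketch (the data is synthetic and the variable names are hypothetical, not drawn from the study above): when an algorithm ranks patients by a proxy such as predicted healthcare spending, groups that historically received less care for the same level of need end up under-prioritized, even though no one programmed that outcome.

```python
# Illustrative sketch with synthetic data: ranking patients by a cost proxy
# can under-prioritize a group that historically received less care
# for the same underlying health need.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

group = rng.integers(0, 2, n)                   # 0 = group A, 1 = group B
need = rng.normal(loc=5.0, scale=1.0, size=n)   # true (unobserved) health need

# Historical spending: group B receives systematically less care for the
# same need, so observed cost understates that group's need.
cost = need * np.where(group == 1, 0.7, 1.0) + rng.normal(0, 0.3, n)

# The "risk score" is the cost proxy; the top 10% are flagged for extra care.
threshold = np.quantile(cost, 0.90)
flagged = cost >= threshold

for g in (0, 1):
    mask = group == g
    print(f"group {g}: mean true need {need[mask].mean():.2f}, "
          f"flagged for extra care {flagged[mask].mean():.1%}")
# Despite identical average need, group B is flagged far less often,
# because the proxy label encodes historical under-treatment.
```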
As a Teach For America corps member, I served as director of entrepreneurial studies at East Central High School Entre Magnet in Tulsa, Oklahoma. During that time, I saw many underprivileged families get evicted from their homes. Why? Time and again, they were bankrupted by medical bills. In addition, within zip codes just a mile and a half from my home, there was a 10-year disparity in life expectancy.
I wanted to find out why this was happening, and soon discovered that one reason was flawed healthcare data, shaped by entrenched socioeconomic factors, feeding biased AI into the system. As a result, many people lacked access to essential care as well as the insurance coverage to pay for it.