Reducing Bias in Algorithms: Spotlight on Pennsylvania

As states continue developing their health equity strategies, an emerging consideration is that commercial algorithms used by states’ Medicaid managed care organizations to inform clinical care decisions may exhibit significant bias.

In June, the World Health Organization (WHO) released a report on artificial intelligence (AI) in health, along with six guiding principles for its design and use, noting that “AI holds great promise for improving the delivery of healthcare and medicine worldwide, but only if ethics and human rights are put at the heart of its design, deployment, and use.” The report joins a growing number of accounts of AI’s shortcomings, such as a Science study finding that an algorithm widely used in health systems disproportionately excludes Black patients from targeted care outreach. The algorithm uses healthcare costs as a proxy for health needs; because Black patients face access barriers and have fewer care encounters, less money is spent on their care, so the cost proxy systematically understates how sick they are. The Center for Applied Artificial Intelligence, whose researchers include authors of the Science study, has released a playbook that defines processes and tools to help measure and address bias in algorithms.
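To make the cost-proxy mechanism concrete, the sketch below simulates it with entirely synthetic data: two groups with identical illness distributions, one of which generates lower costs at the same level of illness because of access barriers. The group sizes, cost formula, barrier multiplier, and 10 percent outreach threshold are illustrative assumptions, not figures from the Science study.

```python
import random
import statistics

random.seed(0)

def simulate_patient(access_barrier: bool):
    # True health need: number of chronic conditions -- the thing a
    # targeted-care program actually wants to find.
    conditions = random.randint(0, 8)
    # Observed annual cost rises with illness, but patients facing
    # access barriers generate less cost at the SAME level of illness
    # (fewer visits, fewer encounters) -- the mechanism the study found.
    cost = 1_000 + 2_500 * conditions
    if access_barrier:
        cost *= 0.75  # illustrative assumption, not a study figure
    return conditions, cost * random.uniform(0.8, 1.2)

# Two synthetic groups of equal size with identical illness distributions.
group_a = [simulate_patient(access_barrier=False) for _ in range(10_000)]
group_b = [simulate_patient(access_barrier=True) for _ in range(10_000)]

# A cost-proxy risk score: rank everyone by observed cost and flag the
# top 10% for targeted care outreach.
everyone = ([("A", c, cost) for c, cost in group_a]
            + [("B", c, cost) for c, cost in group_b])
everyone.sort(key=lambda p: p[2], reverse=True)
flagged = everyone[: len(everyone) // 10]

share_b = sum(1 for g, _, _ in flagged if g == "B") / len(flagged)
mean_a = statistics.mean(c for g, c, _ in flagged if g == "A")
mean_b = statistics.mean(c for g, c, _ in flagged if g == "B")

print(f"Group B share of outreach slots: {share_b:.1%} (equal need => ~50%)")
print(f"Mean chronic conditions among flagged: A={mean_a:.2f}, B={mean_b:.2f}")
# Group B is under-selected, and the B patients who are selected are
# sicker than their flagged A counterparts -- the signature of label
# choice bias that an algorithm audit is designed to detect.
```

Under these assumptions, the lower-cost group fills far fewer outreach slots than its share of illness warrants, and its flagged members are sicker than flagged members of the other group; checking for exactly this pattern at equal risk scores is the kind of audit step the playbook is designed to support.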

Pennsylvania has taken steps to better understand how to mitigate racial bias in algorithms. Pennsylvania’s Department of Human Services (DHS) has collected information on how Medicaid Managed Care Organizations (MCOs) are using algorithms, the types of proxies being used, and the outcomes; preliminary results revealed inconsistent approaches and processes. Pennsylvania DHS is now leading an effort, in collaboration with the Center for Applied AI and informed by the framework in the Center’s playbook, to work with the MCOs on concrete steps to mitigate bias in their algorithms. Pennsylvania is the first state in the country to do so.

At Pennsylvania DHS, addressing algorithmic bias is part of a larger strategy to improve equity across all of the agency’s programs and activities, a strategy recently detailed in the state’s comprehensive Racial Equity Report. The report describes other important equity-focused initiatives, including:

  • A requirement that Medicaid physical health MCOs work toward the National Committee for Quality Assurance’s (NCQA) Distinction in Multicultural Health Care, now Health Equity Accreditation, which recognizes organizations that excel in providing culturally and linguistically appropriate services; each MCO must submit a workplan and timeline for achieving the distinction to DHS at least annually.
  • In Pennsylvania’s Medicaid program, DHS created a new equity incentive payment structure targeted to disparities in two measures of maternal and child health.    
  • Work within DHS to stratify outcomes data by race and ethnicity across its other programs, including plans to examine data for gaps and trends by race in physical health, behavioral health, programs serving people with intellectual disabilities and autism, and long-term services and supports programs (a minimal sketch of this kind of stratification follows this list).
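
As a rough illustration of what stratifying an outcome measure by race and ethnicity looks like in practice, the Python sketch below groups a synthetic member-level file and compares a quality-measure rate across groups. The column names, values, and the measure itself are hypothetical, not drawn from DHS’s actual data systems.

```python
import pandas as pd

# Hypothetical member-level extract: one row per member, with a
# self-reported race/ethnicity field and a flag for whether the member
# met a quality measure (e.g., a timely postpartum visit). The schema
# and values are illustrative, not DHS's actual data.
members = pd.DataFrame({
    "member_id":      [1, 2, 3, 4, 5, 6, 7, 8],
    "race_ethnicity": ["Black", "White", "Black", "Hispanic",
                       "White", "Black", "Hispanic", "White"],
    "measure_met":    [0, 1, 1, 0, 1, 0, 1, 1],
})

# Stratify the measure: rate and denominator for each group.
stratified = (
    members.groupby("race_ethnicity")["measure_met"]
           .agg(rate="mean", denominator="size")
)

# A simple disparity screen: each group's rate relative to the overall
# rate, so gaps stand out even when denominators differ.
overall_rate = members["measure_met"].mean()
stratified["ratio_vs_overall"] = stratified["rate"] / overall_rate
print(stratified.sort_values("ratio_vs_overall"))
```

In real stratified reporting, small denominators need care: rates computed over a handful of members are statistically unstable and can raise privacy concerns, which is why published stratified measures typically suppress or flag small cells.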

Oversight in Pennsylvania extends beyond Medicaid: the Pennsylvania Insurance Department is also taking proactive steps in the commercial insurance market by partnering with the Center for Applied AI. This combined oversight matters because the same algorithms are often used across multiple industries. Although states can lead oversight of Medicaid MCOs, involving other players, such as private insurers and accreditation entities, will be a necessary step in addressing systemic inequities.

Best practices are still taking shape, and states have the opportunity to collaborate with experts. The Agency for Healthcare Research and Quality (AHRQ) has issued a Request for Information to inform an evidence review by its Evidence-based Practice Center (EPC) Program; the responses may also inform other activities commissioned by or conducted in collaboration with AHRQ. Experts say that although bias is prevalent, it is not inevitable; the first step to preventing bias is knowing what influences it.

Acknowledgements

Thank you to Douglas Jacobs, Eric Musser, Jodi Manz, Kitty Purington, Salom Teshale, and Hemi Tewarson for their input on this blog.  
