
Identifying and mitigating algorithmic bias in the safety net

Summary by Nature
Algorithmic bias occurs when predictive model performance varies meaningfully across sociodemographic classes, exacerbating systemic healthcare disparities. NYC Health + Hospitals, an urban safety net system, assessed bias in two binary classification models in our electronic medical record: one predicting acute visits for asthma and one predicting unplanned readmissions. We evaluated differences in subgroup performance across race/ethnicity, se…
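
The subgroup evaluation the abstract describes, comparing a binary classifier's performance across sociodemographic classes, can be sketched briefly. What follows is a minimal, hypothetical Python sketch and not the paper's code: y_true, y_prob, and group are assumed arrays of true labels, predicted probabilities, and demographic categories, and the metrics shown (AUROC, sensitivity, false-positive rate) are common bias-audit choices rather than necessarily the ones the study used.

    # Minimal subgroup-performance audit for a binary classifier.
    # Assumed inputs (illustrative, not from the paper): y_true holds 0/1
    # outcome labels, y_prob holds predicted probabilities, and group holds
    # a demographic category (e.g., race/ethnicity) per patient.
    import numpy as np
    from sklearn.metrics import roc_auc_score, confusion_matrix

    def subgroup_report(y_true, y_prob, group, threshold=0.5):
        """Return AUROC, sensitivity, and false-positive rate per subgroup."""
        y_true, y_prob, group = map(np.asarray, (y_true, y_prob, group))
        report = {}
        for g in np.unique(group):
            mask = group == g
            yt, yp = y_true[mask], y_prob[mask]
            pred = (yp >= threshold).astype(int)
            tn, fp, fn, tp = confusion_matrix(yt, pred, labels=[0, 1]).ravel()
            report[g] = {
                "n": int(mask.sum()),
                # AUROC is undefined when a subgroup has only one class.
                "auroc": roc_auc_score(yt, yp) if len(np.unique(yt)) > 1 else float("nan"),
                "sensitivity": tp / (tp + fn) if (tp + fn) else float("nan"),
                "fpr": fp / (fp + tn) if (fp + tn) else float("nan"),
            }
        return report

Large gaps between subgroups on any of these metrics are the kind of performance variation the study treats as algorithmic bias.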

Bias Distribution

  • 100% of the sources are Center

Nature broke the news in the United Kingdom on Thursday, June 5, 2025.