Identifying and mitigating algorithmic bias in the safety net
Summary by Nature
Algorithmic bias occurs when predictive model performance varies meaningfully across sociodemographic classes, exacerbating systemic healthcare disparities. NYC Health + Hospitals, an urban safety net system, assessed bias in two binary classification models in its electronic medical record: one predicting acute visits for asthma and one predicting unplanned readmissions. The system evaluated differences in subgroup performance across race/ethnicity, se…
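The subgroup comparison the abstract describes can be sketched in a few lines. The sketch below assumes a table of held-out predictions with hypothetical columns `y_true`, `y_score`, and `race_ethnicity`, and flags subgroups whose AUROC falls meaningfully below the overall value; the column names and the 0.05 gap threshold are illustrative assumptions, not the study's actual code, metrics, or cutoffs.

```python
# A minimal sketch of per-subgroup performance evaluation for a binary
# classifier. Column names (y_true, y_score, race_ethnicity) and the
# 0.05 gap threshold are hypothetical, not taken from the study.
import pandas as pd
from sklearn.metrics import roc_auc_score

def subgroup_auroc(df: pd.DataFrame, group_col: str) -> pd.Series:
    """AUROC computed separately for each level of a demographic column."""
    return df.groupby(group_col).apply(
        lambda g: roc_auc_score(g["y_true"], g["y_score"])
    )

def auroc_gaps(df: pd.DataFrame, group_col: str) -> pd.Series:
    """How far each subgroup's AUROC falls below the overall AUROC."""
    overall = roc_auc_score(df["y_true"], df["y_score"])
    return overall - subgroup_auroc(df, group_col)

if __name__ == "__main__":
    # Toy data standing in for model predictions on a held-out set.
    df = pd.DataFrame({
        "y_true": [0, 1, 0, 1, 0, 1, 1, 0],
        "y_score": [0.2, 0.9, 0.1, 0.8, 0.4, 0.3, 0.7, 0.6],
        "race_ethnicity": ["A", "A", "A", "A", "B", "B", "B", "B"],
    })
    gaps = auroc_gaps(df, "race_ethnicity")
    print(gaps[gaps > 0.05])  # subgroups underperforming the overall model
```

Comparing each subgroup against the overall metric, rather than pairwise against every other subgroup, is one common convention; any single-number summary like AUROC can mask calibration or error-rate differences, so analyses of this kind typically report several metrics per subgroup.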