The introduction of automation into the hiring process has put a spotlight on a persistent problem: discrimination in hiring on the basis of protected-class status. Left unchecked, algorithmic applicant screening can exacerbate pre-existing societal inequalities and even introduce new sources of bias; designed with bias mitigation in mind, however, automated methods have the potential to produce fairer decisions than non-automated ones. In this talk, Swati Gupta of the Georgia Institute of Technology focuses on selection algorithms used in the hiring process (e.g., resume-filtering algorithms) that only have access to a "biased evaluation metric": she assumes the method for numerically scoring applications is inaccurate in a way that adversely impacts certain demographic groups.
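To make the "biased evaluation metric" setup concrete, here is a minimal toy simulation (the variable names, the multiplicative-discount bias model, and all numbers are illustrative assumptions, not the speaker's actual formulation): candidates have a true score, but the screening algorithm only observes scores in which one group's values are systematically discounted, so naive top-k selection under-selects that group.

```python
import random

random.seed(0)

BIAS = 0.7  # assumed model: group B's observed scores are discounted by this factor

candidates = []
for i in range(1000):
    group = "A" if i % 2 == 0 else "B"
    true_score = random.random()          # latent quality, unobserved by the screener
    observed = true_score * (BIAS if group == "B" else 1.0)  # biased evaluation metric
    candidates.append((observed, true_score, group))

# Naive selection: take the top 100 candidates by the biased observed score.
top = sorted(candidates, reverse=True)[:100]
share_b = sum(1 for _, _, g in top if g == "B") / len(top)
print(f"Group B share of selected pool: {share_b:.0%}")  # far below the 50% base rate
```

Even though both groups have identical true-score distributions, the discounted metric pushes nearly all of group B below the selection threshold, which is the kind of adverse impact the talk's selection algorithms are designed to counteract.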
This talk was presented at the Harvard Business School Crossing Disciplines: Studying Fairness, Bias, and Inequality in Management and Decision Sciences Research workshop on May 21, 2021.
For more on the Digital Initiative, check out digital.hbs.edu.