Biased Hiring Software

The third technology with which we want to illustrate the political dimension of technology is automated hiring software. Software now takes over many steps of the hiring process to make it more efficient, and companies increasingly rely on machine-learning algorithms to sift through more applicant data faster and find the right candidate.

However, just as with facial recognition software, these systems raise the challenge of algorithmic fairness. As machine-learning algorithms spread into areas where personal data is processed, algorithmic fairness becomes ever more important.

The stakes become apparent in a study of the US labor market which showed that applicants with African-American names are systematically discriminated against: résumés with white-sounding names received significantly more callbacks for interviews.

When algorithms are introduced into such environments, it has been shown that prevailing social inequalities are not eradicated but reproduced. For example, as the video above shows, a hiring algorithm used by Amazon between 2014 and 2017 discriminated against applications that included the word “women.” Another study “demonstrated that Facebook’s housing and employment ads delivery follows gender and race stereotypes.”
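To see how a model can reproduce rather than eradicate bias, here is a minimal, entirely hypothetical sketch (not Amazon’s actual system): a naive keyword-scoring model is fitted to past hiring decisions in which rejections happened to correlate with the token “women” rather than with any skill. The model then learns a negative weight for that word itself.

```python
from collections import defaultdict
import math

# Hypothetical toy history of past decisions: 1 = hired, 0 = rejected.
# Rejections correlate with the token "women", not with the skill tokens.
history = [
    ({"python", "ml", "women"}, 0),
    ({"python", "ml"}, 1),
    ({"java", "sql", "women"}, 0),
    ({"java", "sql"}, 1),
    ({"python", "sql", "women"}, 0),
    ({"python", "sql"}, 1),
]

def token_weights(data, smoothing=1.0):
    """Log-odds of being hired given each token, with add-one smoothing."""
    hired = defaultdict(float)
    total = defaultdict(float)
    for tokens, label in data:
        for t in tokens:
            hired[t] += label
            total[t] += 1
    weights = {}
    for t in total:
        p = (hired[t] + smoothing) / (total[t] + 2 * smoothing)
        weights[t] = math.log(p / (1 - p))
    return weights

def score(tokens, weights):
    """Sum of learned token weights; higher means 'more hireable'."""
    return sum(weights.get(t, 0.0) for t in tokens)

w = token_weights(history)
# The model has learned to penalize the word "women" itself,
# so two otherwise identical résumés get different scores:
print(score({"python", "ml"}, w), score({"python", "ml", "women"}, w))
```

Nothing in the code mentions gender explicitly; the bias enters entirely through the historical labels, which is exactly why such systems reproduce the inequalities present in their training data.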

References:

  1. https://www.fastcompany.com/40566971/the-potential-hidden-bias-in-automated-hiring-systems
  2. https://www.weforum.org/agenda/2019/05/ai-assisted-recruitment-is-biased-heres-how-to-beat-it/
  3. https://www.nber.org/papers/w9873