How Amazon's hiring AI might have developed a bias against female applicants

In 2014, Amazon engineers developed an algorithm to automate hiring processes. The artificial intelligence tool could review applicants' resumes and rate them out of five to help the human resources team decide who should join the company. However, Amazon's team discovered that the algorithm systematically discriminated against female applicants. Three possible sources of bias might have contributed to this behavior: feature selection, the training data, and the cost function.
[...] Amazon's dataset contained more resumes from male applicants than from female applicants. While the exact ratio is unknown, assume for illustration that men made up 75% of applicants and women 25%. Any machine learning algorithm needs a large amount of data to be trained. Because the female subset is much smaller, the algorithm cannot identify as many distinguishing features for that group. It will therefore struggle to tell a female applicant suitable for a position apart from an unsuitable female candidate. [...]
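To make the effect concrete, here is a minimal sketch on entirely synthetic data (the 75/25 split, the feature coefficients, and the notion of "suitability" below are illustrative assumptions, not anything from Amazon's system). A single classifier is trained on the pooled, imbalanced data; because the minority group's suitability signal comes through features the model sees far less often, per-group accuracy diverges.

# Minimal sketch, synthetic data only; not Amazon's system.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_group(n, coef):
    # Resumes as feature vectors; "suitability" depends on
    # group-specific coefficients (a hypothetical simplification).
    X = rng.normal(size=(n, 5))
    y = (X @ coef + rng.normal(scale=0.5, size=n) > 0).astype(int)
    return X, y

# Each group's suitability signal lives in different features.
X_m, y_m = make_group(7500, coef=np.array([1.0, 0.8, 0.0, 0.0, 0.0]))
X_f, y_f = make_group(2500, coef=np.array([0.0, 0.0, 1.0, 0.8, 0.0]))

X = np.vstack([X_m, X_f])
y = np.concatenate([y_m, y_f])
group = np.array([0] * len(y_m) + [1] * len(y_f))  # 0 = male, 1 = female

X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    X, y, group, test_size=0.3, random_state=0, stratify=group)

model = LogisticRegression().fit(X_tr, y_tr)

for g, name in [(0, "male"), (1, "female")]:
    mask = g_te == g
    print(f"{name} accuracy: {model.score(X_te[mask], y_te[mask]):.2f}")

Under these assumptions, the fitted weights tilt toward the majority group's features, so accuracy on the underrepresented group comes out noticeably lower, which is the failure mode described above.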
[...] The training dataset was produced by Amazon managers and composed of the resumes of past applicants and current employees. These resumes most likely carried human resources appraisals or comments highlighting the applicants' qualities. A possible source of bias could stem from these comments: for instance, women's academic performance or experience may have been undervalued relative to that of their male counterparts. If the dataset was skewed against women, the algorithm will inherit the bias. Another possible source is selection bias: the training dataset was composed of resumes Amazon received in the past. [...]
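A short sketch of how labels inherit reviewer bias, again on synthetic data. The womens_club feature is hypothetical (loosely inspired by press reports that the real tool penalized resumes containing the word "women's"); the point is only that if historical scores undervalue a gender-correlated signal, a model fit to those scores learns the same penalty.

# Minimal sketch, synthetic data; feature names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10_000

skill = rng.normal(size=n)                # true qualification signal
womens_club = rng.integers(0, 2, size=n)  # gender-correlated resume term

# Historical reviewers undervalue resumes carrying the term:
biased_score = skill - 1.0 * womens_club
y = (biased_score + rng.normal(scale=0.5, size=n) > 0).astype(int)

X = np.column_stack([skill, womens_club])
model = LogisticRegression().fit(X, y)

print("learned weights [skill, womens_club]:", model.coef_[0].round(2))
# The negative weight on womens_club shows the model reproducing the
# reviewers' penalty, even though the term says nothing about skill.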
[...] The Amazon teams had to solve these problems, and while doing so they may have introduced biases into the algorithm's prediction process. Because the cost of a false positive (hiring a candidate unsuited for the position) is higher for Amazon than the cost of a false negative (rejecting a candidate suited for the position), the algorithm may have developed a preference for false negatives. And because the dataset of female applicants is smaller, the algorithm will have more difficulty making accurate predictions for female resumes. [...]
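This asymmetry can be sketched with class weights, one common way to encode misclassification costs (the 5:1 ratio below is an arbitrary assumption, not a figure from the case). Penalizing false positives more heavily pushes the classifier toward rejection, trading false positives for false negatives.

# Minimal sketch, synthetic data; the cost ratio is an assumption.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(2)
n = 10_000
X = rng.normal(size=(n, 3))
y = (X @ np.array([1.0, 0.7, 0.3]) + rng.normal(size=n) > 0).astype(int)

# class_weight {0: 5, 1: 1} makes misclassifying a "reject" (i.e. a
# false positive) five times as costly as misclassifying a "hire".
balanced = LogisticRegression().fit(X, y)
cautious = LogisticRegression(class_weight={0: 5, 1: 1}).fit(X, y)

for name, m in [("equal costs", balanced), ("FP costs 5x", cautious)]:
    tn, fp, fn, tp = confusion_matrix(y, m.predict(X)).ravel()
    print(f"{name}: false positives={fp}, false negatives={fn}")

Under these assumptions, the cost-weighted model produces far fewer false positives and many more false negatives, the preference described above.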
[...] By optimizing to avoid false positives, the algorithm may make essentially random decisions for female candidates while making accurate ones for male candidates.

Conclusion

The case of the Amazon hiring tool shows that algorithms can replicate and even magnify the biases that exist in society. Fortunately, the Amazon teams stopped using the tool when they realized the problem could not be fixed. Otherwise, whether or not Amazon's engineers intended to discriminate, the company would have infringed anti-discrimination laws. AI-powered tools should be designed and monitored very carefully, as there are many possible sources of bias. [...]