Amazon scraps recruiting AI that ‘taught itself to prefer men over women’
The online retail giant has ditched an AI recruiting tool that 'taught itself' to dislike female job candidates
AMAZON has scrapped a recruitment tool that used artificial intelligence to grade job applicants, after it emerged that it was biased against women.
The tool learned to favour men over women after being trained on past CVs written mostly by male candidates.
According to sources speaking to Reuters, Amazon started using the AI-based hiring software in 2014.
The retailing and tech giant wanted to streamline the hiring process for software development and technical roles, with its AI system quickly giving applicants a rating out of five.
"Everyone wanted this holy grail,” one of the anonymous sources said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those."
However, by 2015 human recruiters had realised that it unfairly marked down female applicants, penalising them for including words such as 'women's' in their applications (as in 'member of the women's hockey team').
As a result, the company ditched the AI-based recruiting tool at the beginning of 2017, according to Reuters, although The Sun understands that it was discontinued in 2015.
The tool's bias against women developed because of the way artificial intelligence and machine learning work.
Through machine learning, the tool was taught to assess applications by being fed CVs submitted to the company over a ten-year period.
The vast majority of those CVs came from male candidates, a reflection of the tech industry's historical domination by men, so the AI's idea of an 'ideal applicant' was skewed towards male-associated characteristics.
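To see how that kind of bias can emerge, here is a minimal, purely illustrative sketch rather than Amazon's actual system: it trains a simple text classifier (using scikit-learn) on a handful of made-up CVs in which most past hires were men, then shows that the word 'women' ends up with a negative weight. The CVs, outcomes and choice of model are all assumptions made for the example.

```python
# Illustrative sketch only -- a toy stand-in for the kind of system described,
# not Amazon's tool. It trains a classifier on made-up historical hiring data
# in which most past hires were men, then inspects the learned word weights.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy CVs and past hiring outcomes (1 = hired, 0 = rejected).
# In this invented data, words like "women's" only appear alongside rejections.
cvs = [
    "software engineer python java men's rugby team captain",    # hired
    "backend developer c++ linux chess club",                     # hired
    "software engineer python women's hockey team captain",       # rejected
    "data engineer sql python women's coding society",            # rejected
    "devops engineer aws docker marathon runner",                  # hired
    "frontend developer javascript react women's chess club",     # rejected
]
hired = [1, 1, 0, 0, 1, 0]

vectoriser = CountVectorizer()   # turns each CV into word counts
X = vectoriser.fit_transform(cvs)

model = LogisticRegression()
model.fit(X, hired)

# The default tokeniser drops the apostrophe-s, so the learned token is "women".
weights = dict(zip(vectoriser.get_feature_names_out(), model.coef_[0]))
print("weight for 'women':", round(weights["women"], 3))  # negative: the word lowers the score
```

The model has no notion of gender; it simply learns that words common in past rejections predict rejection, which is how a female-associated term such as 'women's' can end up counting against an applicant.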
The anonymous sources told Reuters that Amazon's recruiters never based hiring decisions on the tool's ratings alone, although they did look at its recommendations when sorting through applications.
Amazon isn't the only company to run into trouble when using AI and machine learning in the context of recruitment.
In September, Facebook came under fire when it emerged that its algorithms had been preventing women from seeing certain job advertisements.
And beyond the world of recruitment, a 2016 study found that an algorithm used in the US to predict reoffending rates among convicts was biased against African Americans.
Do you trust AI-based algorithms to make reliable decisions? Let us know in the comments!