Why did Amazon bail on an AI-based recruitment tool?

10/10/2018
Like humans, it seems even computers can show bias.

At least, this was the case at Amazon when machine-learning specialists discovered that their recruiting engine was biased against female candidates, according to Reuters.

The team had been building computer programs since 2014 to review job applicants’ resumes with the goal of automating the search for top talent. However, the company realized this system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way, the report said.

Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period, most of which came from men.

Even after the programs were edited to make them neutral, there was no guarantee that the machines would not “learn” other ways of sorting candidates that could prove discriminatory. As a result, Amazon disbanded the team by the start of last year, the report said.
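As a rough illustration of the mechanism described above (this is a toy sketch, not Amazon’s system, and all data and feature names here are hypothetical), a text classifier trained on historically skewed hiring outcomes can assign negative weight to terms that act as gender proxies, even though gender itself is never an input feature:

```python
# Toy illustration (not Amazon's system): a resume classifier trained on
# historically skewed hiring data can pick up gendered proxy terms, even
# when gender is never provided as an explicit feature.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: past resume snippets and whether the
# candidate was hired. The skew mirrors a male-dominated applicant pool,
# so words correlated with gender end up correlated with the label.
resumes = [
    "captain of chess club, java developer, hackathon winner",
    "java developer, open source contributor, chess club",
    "python developer, robotics team lead",
    "women's chess club captain, java developer, hackathon winner",
    "women's coding society, python developer, open source contributor",
    "volunteer tutor, java developer",
]
hired = [1, 1, 1, 0, 0, 0]  # skewed historical outcomes, not true ability

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weights: a proxy term like "women" receives a
# negative coefficient because it appears only in resumes that were
# historically rejected, not because it signals anything about skill.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
for term in sorted(weights, key=weights.get):
    print(f"{term:>12s}  {weights[term]:+.3f}")
```

Removing one such term does not solve the underlying problem: as long as the training outcomes are skewed, the model can find other correlated words to serve as proxies, which is the risk the Reuters report describes.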

Amazon’s recruiters looked at the recommendations generated by the tool when searching for new hires, but never relied solely on those rankings, sources familiar with the situation told Reuters.

To read more, see the full report from Reuters.