Amazon AI Recruiter
2015
Amazon set out to build the ultimate recruitment screening tool: an AI that could scan resumes and automatically rank the best candidates. The goal was to make hiring like everything else at Amazon: data-driven and efficient.
To teach the AI what a good candidate looked like, Amazon fed it ten years of its own hiring data. The model analyzed past applicants' resumes and learned the patterns of successful hires. Unfortunately, it also learned something else.
Because Amazon's past tech hires were overwhelmingly male, the AI taught itself that male candidates were preferable. The system began penalizing resumes containing the word "women's" (as in "women's soccer club") and downgraded graduates of women's colleges.
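The mechanism is easy to reproduce in miniature. Below is a toy sketch (not Amazon's actual model or data): a naive-Bayes-style scorer trained on invented "historical" resumes where, mirroring a male-dominated hiring record, the token "women's" happens to appear mostly in rejections. The model dutifully learns to penalize it.

```python
from collections import Counter
import math

# Hypothetical "historical hiring data" for illustration only.
# Past outcomes reflect a skewed workforce, so tokens correlated
# with women show up mostly in rejected resumes.
past_resumes = [
    ("software engineer chess club", 1),          # 1 = hired
    ("software engineer robotics", 1),
    ("data scientist robotics", 1),
    ("software engineer women's chess club", 0),  # 0 = rejected
    ("software engineer women's college", 0),
    ("data scientist women's soccer club", 0),
]

def token_weights(data, smoothing=1.0):
    """Log-odds of 'hired' per token, with add-one smoothing."""
    hired, rejected = Counter(), Counter()
    for text, label in data:
        (hired if label else rejected).update(text.split())
    vocab = set(hired) | set(rejected)
    return {
        t: math.log((hired[t] + smoothing) / (rejected[t] + smoothing))
        for t in vocab
    }

weights = token_weights(past_resumes)

def score(resume):
    """Sum the learned token weights; higher means 'more hireable'."""
    return sum(weights.get(t, 0.0) for t in resume.split())

# Two otherwise-identical resumes: the one mentioning "women's"
# scores lower, because the model faithfully reproduces the bias
# baked into its training data.
print(score("software engineer chess club"))
print(score("software engineer women's chess club"))
```

Note that nothing in the code mentions gender; the penalty emerges purely from the correlations in the (biased) training labels, which is exactly why such systems are hard to "edit" into neutrality.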
Amazon's engineers tried to edit the system to make it neutral to these specific terms, but they could not guarantee it wouldn't invent new ways to discriminate. They had built a perfect machine for automating past bias.
Remember, kids: AI is only as objective as the flawed human world it learns from.