Do Amazon’s Biased Algorithms Spell the End of AI in Hiring?
Amazon recently cancelled an AI-driven recruiting program after it was discovered that it led to bias in hiring recommendations. But does that mean that machine learning has no role in human resources? The data, as you might imagine, is mixed.
Amazon last year cancelled an AI-based hiring program, started in 2014, after it showed bias against women, according to an October 9 Reuters story. The program used machine learning algorithms to automatically rate resumes submitted by people seeking technical jobs at the e-commerce giant, but over time it learned to rate women’s resumes lower.
Approximately 500 models were trained on job function data culled from the Web, according to the story, and eventually they learned to recognize about 50,000 terms that commonly appear on resumes. The idea was to speed up resume screening for recruiters during a period of rapid growth for the company, which Reuters says tripled its employee count to more than 575,000 over the past three years. That works out to roughly 350 new hires per day.
Amazon’s machine learning models learned to discount the most common terms relating to developer and other technology job functions, such as the names of programming languages, since nearly every applicant listed them. Instead, they assigned more weight to phrases like “captured” and “executed.”
However, those power phrases, it turns out, are more commonly found on male engineers’ resumes, Reuters says. At the same time, the models learned to assign negative weights to gendered terms such as the word “women’s,” as in “women’s chess club.” In fact, the models even downgraded graduates of two all-women’s universities, the Reuters story says.
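To see how this kind of bias can creep in, here is a minimal, hypothetical sketch, not Amazon’s actual system: a bag-of-words logistic-regression resume scorer trained on historical hire/no-hire labels. The toy resumes and labels below are invented for illustration; the point is that when the training labels favor resumes without a gendered token, the model assigns that token a negative weight all on its own.

```python
# Hypothetical sketch (invented data, NOT Amazon's system): a bag-of-words
# logistic-regression scorer trained on historically biased hire/no-hire labels.
import math
import re
from collections import defaultdict

def tokens(text):
    """Crude tokenizer: lowercase words, keeping apostrophes."""
    return re.findall(r"[a-z']+", text.lower())

# Toy historical data: similarly qualified resumes, but the past labels
# favor the ones without the token "women's" -- the bias is in the labels.
data = [
    ("executed backend services in java", 1),
    ("captured requirements and shipped python tools", 1),
    ("women's chess club captain, shipped python tools", 0),
    ("women's coding society lead, backend services in java", 0),
]

w = defaultdict(float)   # one learned weight per token
b = 0.0                  # bias term
lr = 0.5
for _ in range(200):     # plain stochastic gradient descent on log loss
    for text, label in data:
        toks = tokens(text)
        z = b + sum(w[t] for t in toks)
        p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of "hire"
        g = p - label                    # gradient of log loss w.r.t. z
        b -= lr * g
        for t in toks:
            w[t] -= lr * g

# The gendered token ends up with a negative weight; a "power verb" ends up positive.
print(round(w["women's"], 2), round(w["executed"], 2))
```

Nothing in the code mentions gender; the negative weight on “women’s” falls out entirely from the skewed labels the model was asked to reproduce.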
Amazon data scientists spotted the bias against women at an early stage, and took steps to remedy it. However, they still could not be sure that the models wouldn’t find another way to discriminate against women, according to the Reuters story. What’s more, the models also made poor decisions when they recommended applicants for jobs they were not qualified to hold.
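Why couldn’t the offending terms simply be deleted? The hypothetical sketch below (invented data, not Amazon’s system) shows the proxy problem: the explicit token “women’s” is filtered out of the features before training a bag-of-words scorer, yet correlated tokens such as “chess” and “society” absorb the same negative signal.

```python
# Hypothetical sketch (invented data, NOT Amazon's system): banning the
# explicit gendered token does not remove the bias, because correlated
# "proxy" tokens soak up the same negative signal from the labels.
import math
import re
from collections import defaultdict

BANNED = frozenset({"women's"})

def tokens(text):
    """Tokenize, dropping the explicitly gendered token from the features."""
    return [t for t in re.findall(r"[a-z']+", text.lower()) if t not in BANNED]

data = [
    ("executed backend services in java", 1),
    ("captured requirements and shipped python tools", 1),
    ("women's chess club captain, shipped python tools", 0),
    ("women's coding society lead, backend services in java", 0),
]

w, b, lr = defaultdict(float), 0.0, 0.5
for _ in range(200):     # stochastic gradient descent on log loss
    for text, label in data:
        toks = tokens(text)
        z = b + sum(w[t] for t in toks)
        p = 1.0 / (1.0 + math.exp(-z))
        g = p - label
        b -= lr * g
        for t in toks:
            w[t] -= lr * g

assert w["women's"] == 0.0   # the banned token carries no weight...
# ...but tokens that co-occurred with it went negative instead:
print(round(w["chess"], 2), round(w["society"], 2))
```

This is the dynamic the Reuters story describes: neutralizing particular terms is no guarantee, because the model can rediscover the same bias through whatever features correlate with them.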
Faced with a machine learning model gone rogue, Amazon essentially killed the program in early 2017. The company says its recruiters never based hiring decisions solely on the tool’s recommendations, though they did take those recommendations into account. A much scaled-back version of the system continues to run, but it is only responsible for weeding out duplicate candidate entries, according to the Reuters story.
Setback for AI
The fact that Amazon, one of the most technically advanced companies on the planet, could not get AI to work reliably for hiring should be a wake-up call about how difficult this is, says Kurt Heikkinen, president and CEO of Montage, a developer of hiring software and solutions.
Montage dug into the state of AI in hiring in a recent report, dubbed “State of AI in Talent Acquisition.” The report, based on two surveys, one of 500 job seekers and one of 500 hiring professionals, found that while companies are beginning to adopt AI technology, it has a long way to go.
The biggest takeaway was that 51% of organizations are already using AI in the hiring process. However, it’s being used for relatively low-level tasks, including automating certain activities around sourcing and screening potential candidates and scheduling interviews. Only 34% of talent acquisition (TA) professionals said they use AI in the candidate selection process, while about the same percentage (36%) use it during the interviewing process.
Montage concluded that job candidates just aren’t ready to interact with certain forms of AI. Only 6% of job candidates said they’d be comfortable with facial recognition technology being used during an interview, a strong reaction to the “creepy factor” that AI tech can have. “AI can still inform better, faster and smarter hiring decisions when applied,” Heikkinen says, “but it’s imperative that AI be used to inform hiring decisions, not make them.”
The Amazon episode is a black mark on algorithmic hiring, to be sure. But it won’t spell the end of the practice, according to Arran Stewart, co-founder and CVO of Job.com.
“It is the dawn of a new level of understanding and awareness we all need to have when it comes to the shortcomings of AI,” he tells Datanami. “We have been using this technology for years in recruitment and Amazon has made a bold move to say that it is flawed. The beauty of AI and technology is when it fails at something, it simply means we need to build it in another way to make it work for all.”
Amazon may have failed, but that doesn’t mean that it can’t be done, according to Stewart, who says there’s a level of “specialized knowledge” in the recruiting industry that Amazon does not possess. “There are focused platforms in the recruitment space that are far greater experienced,” he says, “and due to this, Amazon may choose to leave it to the professionals.”