Technology
Amazon’s sexist recruiting algorithm reflects a larger gender bias
AI may have sexist tendencies. But, sorry, the problem is still us humans.
Amazon recently scrapped an employee recruiting algorithm plagued with problems, according to a report from Reuters. Ultimately, the applicant screening algorithm did not return relevant candidates, so Amazon canned the program. But in 2015, Amazon had a more worrisome issue with this AI: it was down-ranking women.
The algorithm was only ever used in trials, and engineers manually corrected for its bias problems. However, the way the algorithm functioned, and the existence of the product itself, speak to real problems of gender disparity in tech and non-tech roles, and the devaluation of work perceived as female.
Amazon created its recruiting AI to automatically surface the best candidates from a pool of applicant resumes. The company discovered that the algorithm down-ranked resumes that included the word “women’s,” and even penalized graduates of two all-women’s colleges. It also gave preference to resumes containing what Reuters called “masculine language”: strong verbs like “executed” or “captured.”
These patterns emerged because the engineers trained their algorithm on candidates’ resumes submitted over the previous ten years. And lo and behold, most of the historically successful candidates were men. Essentially, the algorithm found evidence of gender disparity in technical roles and optimized for it: it neutrally replicated a societal, endemic preference for men, wrought from an educational system and a cultural bias that encourage men and discourage women in the pursuit of STEM roles.
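The dynamic is easy to reproduce in miniature. Here is a minimal sketch, not Amazon’s system: the resumes, accept/reject labels, and model below are all invented for illustration. It shows how an off-the-shelf text classifier trained on historically biased outcomes ends up assigning a negative weight to a token like “women”:

```python
# A toy illustration (hypothetical data): a bag-of-words classifier
# trained on biased historical hiring labels learns to penalize
# tokens that correlate with rejected candidates.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented "historical" resumes; the mostly male accepted pool means
# the word "women's" appears only in rejected applications.
resumes = [
    "executed migration project, captained robotics team",
    "captured market data, led backend rewrite",
    "women's chess club captain, built compiler",
    "women's coding society, shipped mobile app",
    "executed performance tuning, managed deploys",
    "women's hackathon winner, designed REST API",
]
accepted = [1, 1, 0, 0, 1, 0]  # biased past outcomes, not merit

vec = CountVectorizer()
X = vec.fit_transform(resumes)
clf = LogisticRegression().fit(X, accepted)

# The learned coefficients mirror the bias in the labels:
# "executed"/"captured" come out positive, "women" negative.
weights = dict(zip(vec.get_feature_names_out(), clf.coef_[0]))
for token in ("executed", "captured", "women"):
    print(token, round(weights.get(token, 0.0), 3))
```

The model never sees gender as a field; it simply optimizes for whatever the labels reward, which is the same trap Amazon’s far larger system fell into.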
For clues on why there are so few women in tech, watch a recruiting session for college seniors. Few women engineers, even fewer speak, men talking over them https://t.co/8CmfAB9J3Q via @jessiwrites
— Scott Thurm (@ScottThurm) March 1, 2018
Amazon emphasized in an email to Mashable that it scrapped the program because it was ultimately not returning relevant candidates; it dealt with the sexism problem early on, but the AI as a whole just didn’t work that well.
However, the creation of hiring algorithms themselves — not just at Amazon, but across many companies — still speaks to another sort of gender bias: the devaluing of female-dominated Human Resources roles and skills.
According to the U.S. Department of Labor (via the consulting firm Visier), women occupy nearly three-fourths of HR managerial roles. This is great news for overall female representation in the workplace. But the disparity exists thanks to another sort of gender bias.
There is a perception that HR jobs are feminine roles. The Globe and Mail writes in its investigation of sexism and gender disparity in HR:
The perception of HR as a woman’s profession persists. This image that it is people-based, soft and empathetic, and all about helping employees work through issues leaves it largely populated by women as the stereotypical nurturer. Even today, these “softer” skills are seen as less appealing – or intuitive – to men who may gravitate to perceived strategic, analytical roles, and away from employee relations.
Amazon and other companies that pursued AI integrations in hiring wanted to streamline the process, yes. But automating a people-based process shows a disregard for the people-based skills that are harder to mechanically reproduce, like intuition or rapport. Reuters reported that Amazon’s AI scored applicants on a five-star rating system, “much like shoppers rate products on Amazon”; who needs empathy when you’ve got five stars?
In Reuters’ report, these companies suggest hiring AI as a complement or supplement to more traditional methods, not an outright replacement. But the very drive to automate a process run by a female-dominated division is the other side of the coin of the algorithm’s preference for “masculine language”: where verbs like “executed” and “captured” are subconsciously favored, “listened” or “provided” are shrugged off as inefficient.
The AI explosion is underway. That’s easy to see in every evangelical smartphone or smart home presentation of just how much your robot can do for you, including Amazon’s. But it also means society is opening itself up to an even less inclusive world. AI can double down on discriminatory tendencies in the name of optimization, as we saw with Amazon’s recruiting AI (and others). And because AI is built and led by humans (often, mostly male humans), those humans can unintentionally transfer their unconscious sexist biases into business decisions, and into the robots themselves.
So as our computers get smarter and permeate more areas of life and work, let’s make sure not to lose what’s human — too often dismissed as what’s “female” — along the way.