Amazon has "fired" the artificial intelligence that reviewed its resumes for being sexist, according to Reuters


Toni Castillo

Amazon machine learning specialists have been building a tool for reviewing job applicants' resumes since 2014. A way to streamline the search for talent.

However, they discovered a big problem: their recruiting engine was sexist. It did not like women.

This is what Reuters says, after collecting the testimonies of five people familiar with the matter. The idea was that this artificial intelligence-based tool would be able to process a large number of resumes and, after examining them, yield five perfect candidates ready to hire. It wasn't neutral though.


It discriminated against female software developers, preferring men

In 2015, according to information published by the news agency, Amazon realized that its recruitment system did not judge candidates in a gender-neutral manner: it was biased in favor of men when vetting applicants for software developer positions and other technical roles.

Why? Because of the training data. The machine learning specialists had trained the tool on patterns observed in the resumes submitted to the company over the previous decade, and most of those resumes came from men. That was the problem.

The problem was the database with which the AI was fed: there was a strong male presence in it


With this background, the system taught itself that male candidates were preferable, so it penalized any resume that included female references. It also lowered the rating it gave candidates (scores ranging from one to five stars) whose resumes mentioned all-women's colleges.
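This dynamic can be sketched with a toy model. The code below is purely illustrative (it is not Amazon's system, and the resumes and words are invented): a simple word scorer trained on historical data in which mostly male applicants were hired ends up assigning a penalty to the word "women's", because the data, not the word itself, associates it with rejection.

```python
# Illustrative sketch, NOT Amazon's system: a toy resume scorer trained
# on skewed historical hiring data learns to penalize gendered words.
from collections import Counter

# Hypothetical historical data: hired resumes are mostly male-coded.
hired = [
    "java developer chess club captain",
    "python engineer football team",
    "java developer robotics team",
    "c++ engineer chess club",
]
rejected = [
    "java developer women's chess club captain",
    "python engineer women's college graduate",
]

def word_scores(hired, rejected):
    """Smoothed hired/rejected ratio per word: >1 favors hiring."""
    pos = Counter(w for r in hired for w in r.split())
    neg = Counter(w for r in rejected for w in r.split())
    vocab = set(pos) | set(neg)
    # Laplace smoothing so unseen words score a neutral 1.0.
    return {w: (pos[w] + 1) / (neg[w] + 1) for w in vocab}

scores = word_scores(hired, rejected)

def rate(resume, scores):
    """Rate a resume as the product of its word scores."""
    s = 1.0
    for w in resume.split():
        s *= scores.get(w, 1.0)
    return s

# "women's" appears only in rejected resumes, so any resume containing
# it is rated lower -- the bias comes from the data, not the applicant.
print(rate("java developer chess club", scores) >
      rate("java developer women's chess club", scores))  # True
```

The model never sees a "gender" column; it simply learns whatever correlations the historical outcomes contain.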

According to the details revealed by Reuters, the company's engineers modified the recruiting programs to make them neutral to these particular terms that worked against candidates, but apparently this did not ensure the absence of bias. The modifications were no guarantee that the system would not devise other forms of discrimination, the agency's sources said.
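One reason scrubbing explicit terms is not enough can also be sketched in a few lines. In this made-up example (the data and the college name "riverdale" are invented), the word "women's" is filtered out before scoring, yet the resume is still penalized because a proxy word that co-occurred with it in the training data already carries the learned penalty:

```python
# Sketch with invented data: removing the explicit term "women's" at
# scoring time does not help if a co-occurring proxy word (here the
# made-up college name "riverdale") already carries the learned penalty.
from collections import Counter

hired = ["java developer chess club", "python engineer football team"]
rejected = ["java developer women's club riverdale college",
            "python engineer women's team riverdale college"]

pos = Counter(w for r in hired for w in r.split())
neg = Counter(w for r in rejected for w in r.split())

def score(resume):
    # Product of smoothed hired/rejected word ratios.
    s = 1.0
    for w in resume.split():
        s *= (pos[w] + 1) / (neg[w] + 1)
    return s

def scrub(resume):
    # Filter the explicitly gendered term, as the engineers reportedly did.
    return " ".join(w for w in resume.split() if w != "women's")

biased = "java developer women's club riverdale college"
# Even after scrubbing, "riverdale" and "college" drag the score down.
print(score(scrub(biased)) < score("java developer chess club"))  # True
```

Blocking a list of words changes the input, not the correlations the model has already absorbed, which is consistent with the sources' warning about other forms of discrimination emerging.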

When in doubt, the recruiting artificial intelligence was "fired"

The data an artificial intelligence is fed, the raw material it needs to function, must be correctly processed. And that processing involves whatever corrections the system may need in order to judge with fairness and good criteria.

For example, in this case, where some 500 computer models were developed from a decade of hiring records, the artificial intelligence should have been taught that one of the goals was to offer equal opportunities regardless of gender. Something that, however, is not easy.

Modifying the AI's selection criteria did not guarantee that it would not devise other forms of discrimination

Amazon declined to comment on the matter, but according to information handled by the news agency, the company dissolved the team that created this artificial intelligence early last year. The upper echelons of the company had lost faith in the project and its real usefulness. It was discontinued, although human recruiters may have used some of its output.

This curious and striking case, added to others of a similar nature, highlights the limitations of machine learning and the extensive work that lies ahead. It is not easy to train an artificial intelligence system and obtain the expected results without errors, large or small, even with budgets and talent on the scale of Amazon or Google.
