
Science and Technology

Episode 6: A “misogynist” AI recruiting tool showed bias against women


When we try to define Artificial Intelligence, we can simply describe it as "the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions." Nevertheless, by thinking like a human, can a machine become one? That is, can it develop human characteristics? Can it have emotions, opinions, or judgmental behaviour? Can it, for example, judge people according to criteria never mentioned in its algorithm?

In 2014, a team at Amazon's Edinburgh office created an AI program that automatically sorts through CVs and picks out the most promising candidates. By 2015, however, the company had realized that its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way: the new recruiting engine "did not like women".

The reason Amazon's program penalized resumes that contained the word "women's" is that the system was trained on resumes submitted to the company over a 10-year period, most of which came from men.
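To see how such a bias can arise mechanically, consider the toy sketch below in Python (using scikit-learn). The resumes, labels, and model are invented for illustration and have nothing to do with Amazon's actual system; the point is only that a classifier trained on outcomes where most hires were men learns, on its own, a negative weight for a gendered word.

```python
# Minimal sketch (not Amazon's actual system) of how a resume
# classifier trained on historically male-dominated hiring data
# can learn to penalize gendered terms. All data is invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical historical resumes and outcomes (1 = hired).
# Most hires are men, so female-coded terms co-occur with rejections.
resumes = [
    "software engineer chess club captain",          # hired
    "software engineer baseball team",               # hired
    "software engineer women's chess club captain",  # rejected
    "software engineer women's coding society",      # rejected
]
hired = [1, 1, 0, 0]

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weight for the token "women": on data like
# this it comes out negative, i.e. the word lowers a resume's score.
idx = vec.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])
```

Nothing in this sketch mentions gender explicitly; the negative weight emerges purely from the skewed historical labels, which is exactly the trap Amazon's system fell into.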

In fact, it’s a reflection of male dominance across the tech industry over the past 10 years as shown in the following graph.

 

When Amazon's machine-learning specialists uncovered the problem, they edited the programs in an attempt to fix the bias. However, according to them, there was no guarantee that the machines would not devise other ways of sorting candidates that could prove discriminatory. As a result, Amazon lost faith in the AI recruitment tool's ability to be neutral and abandoned the project altogether.
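That worry about the model finding "other ways" to discriminate can be illustrated with another toy sketch: even when the explicit gendered word is blocked from the vocabulary, the model can shift the same negative signal onto a correlated proxy term. Again, the resumes and terms below are invented for illustration.

```python
# Sketch of the "proxy" problem Amazon's engineers worried about:
# blocking the explicit token does not stop the model from
# rediscovering the bias through correlated words. Data is invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "engineer baseball team java",         # hired
    "engineer chess club java",            # hired
    "engineer women's netball team java",  # rejected
    "engineer women's netball club java",  # rejected
]
hired = [1, 1, 0, 0]

# Blocklist the explicit gendered term...
vec = CountVectorizer(stop_words=["women"])
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# ...but a correlated proxy word picks up the negative weight instead.
idx = vec.vocabulary_["netball"]
print("weight for 'netball':", model.coef_[0][idx])  # negative
```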

Despite the problems Amazon faced with its recruitment engine, other companies are forging ahead, underscoring the eagerness of employers to harness AI for hiring. Microsoft's LinkedIn, the world's largest professional network, has gone further: it offers employers algorithmic rankings of candidates based on their fit for job postings on its site.
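As a rough idea of what such a ranking can look like, here is a generic sketch that scores candidates against a job posting with TF-IDF text similarity. This is a common baseline technique, not LinkedIn's actual, proprietary algorithm, and all names and profiles are made up.

```python
# Generic candidate-ranking sketch: score each profile's textual
# similarity to a job posting. A common baseline, not LinkedIn's
# real system; all data here is hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

job_posting = "senior python developer machine learning aws"
candidates = {
    "candidate_a": "python developer flask aws three years",
    "candidate_b": "machine learning engineer python aws",
    "candidate_c": "graphic designer photoshop illustrator",
}

vec = TfidfVectorizer()
matrix = vec.fit_transform([job_posting] + list(candidates.values()))

# Similarity of each candidate profile to the posting, best first.
scores = cosine_similarity(matrix[0], matrix[1:])[0]
for name, score in sorted(zip(candidates, scores), key=lambda x: -x[1]):
    print(f"{name}: {score:.2f}")
```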

But again, what prevents the system from teaching itself to prefer male candidates over female ones? “How to ensure that the algorithm is fair, how to make sure the algorithm is really interpretable and explainable – that’s still quite far off,” said Nihar Shah, a computer scientist who teaches machine learning at Carnegie Mellon University.
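While those questions remain open, one simple check researchers do use is a fairness criterion such as demographic parity, which compares the rate at which the model selects candidates from each group. The sketch below illustrates the idea with invented numbers; it is not any company's actual audit procedure.

```python
# Demographic-parity check: compare selection rates across groups.
# A generic fairness probe with made-up numbers, for illustration.
def selection_rate(predictions, group, value):
    """Share of candidates in the given group that were selected."""
    selected = [p for p, g in zip(predictions, group) if g == value]
    return sum(selected) / len(selected)

# Hypothetical model outputs (1 = recommend) and candidate genders.
preds  = [1, 1, 0, 1, 0, 0, 1, 0]
gender = ["m", "m", "m", "m", "f", "f", "f", "f"]

rate_m = selection_rate(preds, gender, "m")
rate_f = selection_rate(preds, gender, "f")
print(f"male selection rate:   {rate_m:.2f}")  # 0.75
print(f"female selection rate: {rate_f:.2f}")  # 0.25
# A large gap like this one (0.75 vs 0.25) flags the ranking as
# biased under the demographic-parity criterion.
```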

Using technology to widen the hiring net and reduce reliance on subjective opinions of human recruiters is every employer’s dream. But can technology really be objective? Well, from what we’ve seen thus far, there is still much work to do.

Source:

Amazon scraps secret AI recruiting tool that showed bias against women (Reuters)
