Latest News

Employment discrimination? Blame it on job recruiting sites

Data science consultant Cathy O’Neil helps companies audit their algorithms for a living. And when it comes to how algorithms and artificial intelligence can enable bias in the job hiring process, she said the biggest issue isn’t even with the employers themselves.


A new law in the US that aims to help job seekers understand how AI tools are used to evaluate them in video interviews recently resurfaced the debate over AI’s role in recruiting. But O’Neil believes the law tries to tackle bias too late in the process.


“The problem actually lies before the application comes in. The problem lies in the pipeline to match job seekers with jobs,” said O’Neil, founder of O’Neil Risk Consulting & Algorithmic Auditing. That pipeline starts with the job and social sites where algorithms can play a significant role in determining which candidates see which job postings, filtering out those deemed unqualified. Here’s how she says bias shows up in the hiring process:


AI hiring tools are far from perfect


While algorithms may speed up the process of narrowing the pool of job candidates, they are often not good at finding the most qualified ones, and instead end up disproportionately filtering out candidates from already underrepresented groups.


In 2018, Amazon shut down a tool it had built to automate its hiring using artificial intelligence because it was biased against women. Researchers have also shown how AI tools that analyse video interviews are often biased against people with disabilities.


Bad data in, bad data out


There are several reasons why algorithms can end up discriminating against certain groups. Programmers ‘train’ an algorithm by showing it a massive set of historical data. In the case of a job site, they show it information about past candidates, telling it to look for patterns among people who ultimately got jobs, which it then uses to identify potential candidates with those same qualities. That can lead to problems, however, if the dataset is already skewed.
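The "bad data in, bad data out" dynamic can be sketched in a few lines of code. This is a deliberately simplified, hypothetical illustration, not any real hiring system: a toy "model" learns hiring rates from invented historical records in which every past hire came from one school, and then reproduces that skew when scoring new candidates.

```python
# Hypothetical historical hiring records: (years_experience, from_school_x, hired).
# The data is skewed: past hiring favored graduates of one school regardless
# of experience, so that is the pattern the model will learn.
history = [
    (5, True, True), (2, True, True), (1, True, True),
    (8, False, False), (6, False, False), (3, False, False),
]

def train(records):
    """Toy 'pattern finder': learn the historical hire rate for each group."""
    in_group = [hired for _, x, hired in records if x]
    out_group = [hired for _, x, hired in records if not x]
    return {
        True: sum(in_group) / len(in_group),
        False: sum(out_group) / len(out_group),
    }

model = train(history)

def recommend(from_school_x):
    # Recommend candidates whose group was historically hired.
    return model[from_school_x] > 0.5

# A highly experienced candidate from outside the favored school is filtered
# out, because the historical data contains no hires that look like them.
print(recommend(False))  # False: filtered out regardless of qualifications
print(recommend(True))   # True
```

The point of the sketch is that the model never needs to be told to discriminate; it simply extrapolates from whatever bias the historical data already contains.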


Big data means biased noise


The other issue gets at why O’Neil believes biased job sites are particularly problematic: They factor in information that may have no bearing on a candidate’s ability to do a job, rather than focusing only on relevant details.


Sites like Facebook and LinkedIn use a wide range of demographic information to train their algorithms. Those algorithms then help determine which job ads are shown to which candidates, as well as which candidates appear in recruiters’ search results. Even if that information isn’t explicitly about a candidate’s race or gender, it can still lead to racist or sexist results.
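How a "gender-blind" system can still produce sexist results comes down to proxy features. The hypothetical sketch below (invented data and rule, not any real platform's logic) targets a job ad using only membership in a club, never gender; but because the club is male-dominated in the data, the ad never reaches the female candidates.

```python
from collections import Counter

# Hypothetical candidate records. Gender is never used by the targeting rule,
# but a proxy feature (membership in a male-dominated club) correlates with it.
candidates = [
    {"gender": "M", "in_club": True},
    {"gender": "M", "in_club": True},
    {"gender": "F", "in_club": False},
    {"gender": "F", "in_club": False},
]

def targeting_rule(candidate):
    """Ad-targeting rule trained only on the proxy feature, not on gender."""
    return candidate["in_club"]

# Measure how the 'gender-blind' rule actually distributes the job ad.
shown = Counter(c["gender"] for c in candidates if targeting_rule(c))
print(shown)  # Counter({'M': 2}) -- the ad never reaches female candidates
```

Dropping the protected attribute from the inputs does not remove the bias; any feature that correlates with it can smuggle the same pattern back in.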