The companies that make and sell employment algorithms now commonly used in hiring processes have been criticised by researchers for a lack of transparency and consistency.

A study by researchers at Cornell University’s Faculty of Computing and Information Science found that the vast majority of vendors keep the details of their employment algorithms under wraps, undermining efforts to combat bias in automated hiring.

This is despite bias being an increasingly high-profile issue in algorithmic recruiting tools. Amazon, for example, drew considerable criticism for developing a recruiting tool that unintentionally discriminated against women, even though it abandoned the project before the tool was ever used to evaluate candidates.

The products assessed in this study, however, are being used in real-world hiring processes, and because employment algorithms are considered intellectual property, there is no legal requirement for vendors to disclose how the algorithms are developed or what steps are taken to tackle bias.

And the vast majority of the 19 vendors assessed in the study choose not to disclose such information.

Bias in the machine: The transparency challenge in employment algorithms

The researchers found that many vendors of employment algorithms do not mention bias at all, something they considered a cause for concern.

“Plenty of vendors make no mention of efforts to combat bias, which is particularly worrying since either they’re not thinking about it at all, or they’re not being transparent about their practices,” said study first author Manish Raghavan, a doctoral student in computer science.

Where vendors have addressed the issue of bias, their commentary has often been vague, according to the researchers, using terms such as “fairness” without clarifying what this actually means.

“Calling an algorithm ‘fair’ appeals to our intuitive understanding of the term while only accomplishing a much narrower result than we might hope for,” said Raghavan.

The problem, the researchers say, is that there is no consistent definition of what makes an employment algorithm fair – and vendors are ultimately deciding for themselves what counts as a biased or unbiased algorithm, without external oversight.
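To see why an unqualified claim of “fairness” says so little, consider a minimal, invented sketch (not drawn from the study or from any vendor’s product) in which the same screening outcomes are checked against two widely used but different fairness criteria: selection-rate parity, the idea behind the US “four-fifths rule”, and equal true positive rates across groups, often called equal opportunity. The data, group labels and helper functions below are hypothetical.

```python
# Hypothetical example: two common fairness checks applied to the same
# algorithmic screening outcomes. The data and thresholds are invented
# for illustration; they are not taken from the Cornell study.

def selection_rate(selected, group, g):
    """Fraction of applicants in group g that the tool advances."""
    members = [s for s, grp in zip(selected, group) if grp == g]
    return sum(members) / len(members)

def true_positive_rate(selected, qualified, group, g):
    """Among qualified applicants in group g, the fraction advanced."""
    hits = [s for s, q, grp in zip(selected, qualified, group) if grp == g and q]
    return sum(hits) / len(hits)

# Toy screening outcomes: 1 = advanced to interview, 0 = rejected.
group     = ["A"] * 10 + ["B"] * 10
selected  = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0] + [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
qualified = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0] + [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]

# Check 1: selection-rate parity (the four-fifths rule compares these ratios).
ratio = selection_rate(selected, group, "B") / selection_rate(selected, group, "A")
print(f"selection-rate ratio B/A: {ratio:.2f}")   # 1.00 -> passes

# Check 2: equal opportunity (equal true positive rates across groups).
tpr_a = true_positive_rate(selected, qualified, group, "A")
tpr_b = true_positive_rate(selected, qualified, group, "B")
print(f"TPR group A: {tpr_a:.2f}, TPR group B: {tpr_b:.2f}")  # 0.67 vs 1.00 -> differs
```

In this toy example the tool advances groups A and B at identical rates, yet it passes over a larger share of group A’s qualified applicants, so it satisfies one widely cited criterion while failing another. Without knowing which definition a vendor has in mind, a bare claim of “fairness” is hard to evaluate.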

Waking up to bias

Despite their concerns about the lack of transparency and consistency among creators of employment algorithms, the researchers believe that attitudes are beginning to change and that the issue is starting to be taken seriously.

“I think we’re starting to see a growing recognition among creators of algorithmic decision-making tools that they need to be particularly cognizant of how their tools impact people,” said Raghavan.

The challenge now is developing a consistent approach to doing so.

“Many of the vendors we encountered in our work acknowledge this (impact) and they’re taking steps to address bias and discrimination,” he added.

“However, there’s a notable lack of consensus or direction on exactly how this should be done.”

The research paper, Mitigating Bias in Algorithmic Employment Screening: Evaluating Claims and Practices, will be presented in January at the Association for Computing Machinery Conference on Fairness, Accountability and Transparency.


Read more: AI employment algorithm can help refugees find jobs, says David Miliband