Corporations are using Artificial Intelligence (AI) platforms in the interview process to simplify and speed up hiring. However, real-world use of AI highlights concerns that discrimination caused by flaws in the technology and associated processes is causing suitable candidates to miss the shortlist.
The coronavirus pandemic has severely affected the global job market, with nearly all vertical sectors cutting recruitment in their 2020 intake. In the UK graduate market, for example, the hardest-hit segments were financial services, corporate B2B, media and retail: the number of graduate recruits in 2020 was 12.3% lower than in 2019, with the country's key employers recruiting 3,700 fewer graduates than the previous year.
However, this is just the graduate segment; if we take into account non-graduate positions in verticals like retail, the implications are far greater. In parallel, more candidates are applying for each job opening. As an example, an investment banking analyst graduate vacancy in the UK can easily attract more than two hundred applications, at least 90% of them from candidates with acceptable upper-tier honours degrees and, no doubt, the right credentials.
So, the killer questions: how do HR recruitment functions make the selection process as fair, effective and speedy as possible, and can Artificial Intelligence help?
Has recruitment ever been fair?
With businesses globally placing greater focus on Environmental, Social, and corporate Governance (ESG) compliance, and on diversity and equality in particular, you would expect today's recruitment process to be fair for all, irrespective of race, sex and background.
However, independent studies suggest otherwise. One, conducted by the Institute for Social and Economic Research at the University of Essex in the UK, found that British ethnic minority graduates were between 5% and 15% less likely to be employed than their white British peers six months after graduation, and there are many other studies with similar findings. A key contributing factor is not the absence of the right processes or 'best in class' ESG-compliant metrics; rather, it is unconscious bias in the interview process.
Surely artificial intelligence in recruitment will resolve all this?
On paper, AI in recruitment, and interview screening in particular, offers many benefits. First, it helps meet business KPIs for time-to-hire and for securing a good hire: the goal, as ever, is to find the right people to contribute to a company's success. AI platforms and associated value-added software can manage the complete initial application process, screening all resumes and application documents. They can also automate interviews that would normally be conducted by recruitment personnel, using pre-defined video questions or chats and personality profiling, before ultimately scoring, ranking and shortlisting candidates. Some AI platforms on the market also analyze candidates' facial movements, accents, word choices and appearance before ranking them for suitability.
Key components of AI in general include learning, reasoning, problem solving, perception, and language understanding. Building on these, AI interview platforms work by comparing candidates' responses to reference data sets and scoring them algorithmically, sometimes also analyzing how human assessors rate candidates.
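To make the scoring-and-ranking idea concrete, here is a deliberately simplified, hypothetical sketch in Python. Real platforms use far more sophisticated models (natural language processing, video analysis, trained classifiers); the function names, the keyword-overlap scoring method and the sample data below are all invented for illustration. The sketch also shows where bias can creep in: if the reference terms are derived from past hires, any bias in those past decisions is inherited by the scores.

```python
import re

def score_response(response: str, reference_terms: set) -> float:
    """Return the fraction of reference terms found in a candidate's response.

    Note: if reference_terms were learned from historically biased hiring
    decisions, this score silently reproduces that bias.
    """
    words = set(re.findall(r"[a-z]+", response.lower()))
    return len(words & reference_terms) / len(reference_terms)

def rank_candidates(candidates: dict, reference_terms: set) -> list:
    """Score every candidate and return (name, score) pairs, best first."""
    scored = {name: score_response(text, reference_terms)
              for name, text in candidates.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# Invented reference data set of terms associated with "successful" answers.
reference = {"teamwork", "deadline", "analysis", "client"}

# Invented candidate responses.
candidates = {
    "A": "I led the analysis and kept the client informed of every deadline.",
    "B": "I enjoy teamwork and always respect a deadline.",
}

print(rank_candidates(candidates, reference))
# Candidate A matches 3 of 4 terms (0.75), candidate B matches 2 (0.5).
```

Even this toy example makes the article's point visible: the ranking is only as fair as the reference data behind it.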
However, for all the good Artificial Intelligence brings to society and business, it is ultimately based on learning patterns that mimic human behaviour. In applications like recruitment, where unconscious bias in the interview process comes into play, the technology inherits those biases, and it has technological and moral flaws of its own.
The BBC in the UK recently aired a programme reporting that AI was used by Estee Lauder, within its MAC Cosmetics business, not in the hiring process but in the firing one, and a legal case brought by the ex-employees is currently ongoing. Other live cases at issue include facial analysis technology misinterpreting race, gender, emotion and regional accents in the recruitment process.
Given these drawbacks of Artificial Intelligence in recruitment, and the pressures on the job market, employers, particularly large corporates looking to streamline processes, need a new plan of action to make recruitment fair for all. To do this they may have to go back to grass roots and address the very real issues of unconscious bias and discrimination in society and business. Only then can technologies like AI in recruitment make the impact their suppliers claim.