Bias: one of those concepts that manages to be at once completely obvious and yet almost impossible to pin down. We know phrases such as “unconscious bias” and “selection bias” and, in truth, we recognise more than a little bias in ourselves, but what is it, and what does it mean in the broader arena of recruitment?
If you were to be really prosaic about the whole thing you could look up a definition in a dictionary, and you’d be told that in the first instance a bias is a “predisposition or prejudice” – two words that are night and day, good and bad, and which tell us relatively little about the moral standing of bias. Apply the definition to statistics, however, and the dictionary reveals a far clearer direction, with statistical bias stemming from “systematic distortion” of the data.
And this is where we land when we’re talking about recruitment: we need to look at both the general and the specific, data-driven nature of bias in recruitment, and at how we are using tech to bypass our biggest roadblock to unbiased recruitment: ourselves.
So, what are we talking about when we talk about recruitment bias, and how would we recognise it? Yes, of course, there is the issue of holding a predetermined set of values and opinions about someone based on their gender, age and race, and those opinions, whether positive or negative, would play a huge role in determining whether you recommended someone for a job or not.
But because humans are, by nature, so varied in appearance, accent and attitude, it is entirely possible to be biased for and against almost any aspect of someone’s character, appearance and life experiences, and in doing so we draw on our own histories and value systems – the variables in bias are seemingly endless and often unpredictable.
The other consideration is that, no matter what we tell ourselves, bias is hard to “get over”. You might tell yourself you’ll recruit with absolute respect for the specificity of the role, but when it comes to it, how can you stop your brain from treading well-worn neural paths and making assumptions and judgements based on your existing systems of belief? You really can’t.
So what do you do to stamp out recruitment bias? You bring in technology, of course – specifically, Artificial Intelligence – to help the process along. But guess what? Some of those algorithms you’re using: biased. What’s worse, because they are used on a larger scale, they can spread their biased inflections further, wider and faster than even a meme on Twitter. There have been reported cases of facial recognition software that struggles to identify and distinguish between faces with darker skin.
Only last year Reuters reported that Amazon had to ditch its own recruitment software because it really, really didn’t like women. This was largely because the software recognised that more applications came from men in the male-dominated tech industries and had taught itself to favour applications from men over women. While executives felt confident the problem could at least be neutralised, the programme was eventually scrapped because there was no guarantee the software could ever run with zero bias.
The question remains: can the very human field of Human Resources ever be handed over fully to AI? The answer most industry experts seem to favour is yes. Yes, AI can be used to successfully recruit candidates based entirely on aptitude and sheer suitability for a role and nothing else, but (and you knew this was coming) the software must be entirely free of bias to start with, and it must not learn to become biased as it analyses increasingly large sets of data.
Herein lies the rub: while we may have AI that starts off this way, the very brilliant, terrifying fact that it can learn from data and historic hiring means it’s highly likely that bias will creep in – it simply has to, right? If more men called Steve are hired for jobs as classroom teachers, over time why wouldn’t the software learn to favour Steves over, say, a Jose Luis?
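To see how easily this happens, here is a deliberately toy sketch (entirely hypothetical data, names and scoring method – no real system works this crudely) of a naive model that “learns” from historical hires by counting how often each attribute appears among past successful candidates. A name should be irrelevant, yet the model absorbs it just like any other feature:

```python
from collections import Counter

# Hypothetical hiring history: most past hires happen to be called Steve.
past_hires = [
    {"name": "Steve", "qualification": "teaching_degree"},
    {"name": "Steve", "qualification": "teaching_degree"},
    {"name": "Steve", "qualification": "teaching_degree"},
    {"name": "Anna",  "qualification": "teaching_degree"},
]

def train(hires):
    """Naive 'learning': count each attribute value among past hires."""
    counts = Counter()
    for hire in hires:
        for key, value in hire.items():
            counts[(key, value)] += 1
    return counts

def score(model, candidate):
    """Score a candidate by how closely they resemble past hires."""
    return sum(model[(key, value)] for key, value in candidate.items())

model = train(past_hires)

steve = {"name": "Steve", "qualification": "teaching_degree"}
jose_luis = {"name": "Jose Luis", "qualification": "teaching_degree"}

# Identical qualifications, yet the historic data tips the scales:
print(score(model, steve))      # 7 (4 for the qualification + 3 for the name)
print(score(model, jose_luis))  # 4 (the qualification only)
```

The two candidates are equally qualified, but because the model was never told which features matter, the skew in the historical data flows straight into the scores – exactly the “systematic distortion” the dictionary warned us about.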
Yet we push on further and further towards the goal of fully automated recruitment, and why not? Getting the very best candidates is not a goal to be ashamed of; it’s how businesses of every shape and size succeed and ultimately drive up their profitability. Yet the fact remains that while AI can go a long way in streamlining the process, it cannot, as yet, take on the full responsibility of sifting, selecting and analysing video interviews before making a hiring decision. Not just yet.
Those very human elements of Human Resources we mentioned still need to fall under the remit of our faulty, broken, biased brains – at least until we can create algorithms that don’t learn to exclude people from poorer backgrounds, or applicants with the “wrong” name, and can instead become the better version of ourselves.