Podcast: are recruitment algorithms up to the job?
Biased and unfair computer algorithms are dominating the recruitment industry and can even exclude the best candidates, according to a technology expert.
In the latest episode of The Scotsman's Data Capital podcast, Dr Nakeema Stefflbauer says: "Algorithms are programmed with a list of keywords, ranked in order of priority, and they score CVs submitted for a job application by how frequently those keywords appear.
"Those keywords will dictate whether or not the algorithm is going to rank the candidate as a strong candidate, a qualified candidate - or not. That doesn't take into account the people who haven't got the briefing and aren't aware they need to stuff their CVs with keywords!"
Dr Stefflbauer, an American academic now working in the technology sector in Berlin, says recruitment practices in Europe lag well behind those in the United States.
She explains: "Photographs have been off limits in North America for such a long time - a total no-no and an obvious door to discrimination. It's really shocking to see they're still an expected norm for job applications in Germany.
"Photographs are being used to discriminate between people who cover their hair for religious reasons, people of various ethnicities and racial backgrounds. This should make it even more obvious that the photograph is probably not necessary at all - but it still persists."
Dr Stefflbauer, who is also an investor in tech companies and founder of a social enterprise designed to get more resident, immigrant and refugee women into the tech industry, said many employers were overrun with applications and didn't have enough time to deal with them: "As a result, they have convenient tools, or algorithms, that promise to sort through all the piles of applications quickly, efficiently and - as most of them claim - without bias. The specifics of how these algorithms discriminate between applicants get lost in the sales process."
This often leads to a very narrow and homogenous pool of candidates, Dr Stefflbauer argues.
She believes modern recruitment tools are, generally, not delivering the best candidate for a job - and says this is not being challenged because many people working in recruitment don't know what the algorithms do: "This is the problem with automated decision-making tech. If nobody knows how it works, if nobody sees how the sausage is made, what the inputs are, you get these problems where you're trying to hire for a role - and you get [for example] no women.
"Maybe the algorithm shows that, historically, there has never been an Indian woman who's been chief sales officer in this German company. So your easily identifiable photograph that shows you are Indian and female is either a disqualifying or de-prioritisation factor for that algorithm. So the recruiter or employer is never going to find you because you're ranked low on the list of applicants."
Dr Stefflbauer says that from a sheer fairness perspective, anonymised applications are the way forward - and that the way algorithms work must be carefully examined.
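In code terms, anonymisation amounts to stripping identifying fields from an application before any scoring happens. A minimal sketch, with field names assumed for illustration:

```python
# Field names are illustrative assumptions, not a standard schema.
IDENTIFYING_FIELDS = {"name", "photo", "date_of_birth", "nationality"}

def anonymise(application: dict) -> dict:
    """Drop personally identifying fields before the application is scored."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

application = {
    "name": "A. Candidate",
    "photo": "photo.jpg",
    "nationality": "Indian",
    "skills": ["sales", "leadership"],
    "experience_years": 8,
}
print(anonymise(application))
# {'skills': ['sales', 'leadership'], 'experience_years': 8}
```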
"There needs to be a real conscious effort to assess what jobs require, not just what certificates we like to see - because often, there's very little or no correlation between formalised education and performance on the job."
Organisations could look at what skills and abilities a job requires and feed them into an algorithm, says Dr Stefflbauer.
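One way to picture that approach is a score built on the fraction of a job's required skills a candidate demonstrates, rather than on certificates; the skill lists below are illustrative assumptions:

```python
# Skills-based matching: the required and candidate skill sets here
# are invented for illustration.
def skills_match(required: set[str], candidate: set[str]) -> float:
    """Fraction of the required skills the candidate demonstrates."""
    return len(required & candidate) / len(required)

required = {"negotiation", "crm", "forecasting", "team_leadership"}
candidate = {"negotiation", "crm", "forecasting"}  # no formal degree listed
print(skills_match(required, candidate))  # 0.75
```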
"It's shocking how few people give this the thought that's required. In general, [we] look at what the suppliers providing the algorithms tell us and think that sounds objective and fair. It's much easier to trust a computer; it's human nature because you don't see the people who've built and trained the algorithm. And you don't see any obvious bias."
One way of trying to fix the problem could be the new European AI (Artificial Intelligence) Act, Dr Stefflbauer suggests: "We could lay out some guidelines for the use of personally identifiable information in employment algorithms."
But it is also about individuals taking responsibility and not just looking to blame someone else for why the computer says no: "The mistake is to say, I know that these algorithms are not fair, but surely the legal officer will deal with it or maybe it's the data officer's job. The reality is it's all of our jobs and if none of us know how these algorithms work, eventually we will find ourselves in a situation where we are discriminated against."
Listen to Computer Says No: Algorithms and Recruitment Bias here.
Dr Nakeema Stefflbauer speaks on 29th September at Data for Diversity, a conference developed by The Scotsman and the Data-Driven Innovation initiative, part of the Edinburgh and South East Scotland City Region Deal. Read the agenda and book your free place here.