Linzi Penman and Ishbel MacPherson: AI recruitment tools are the future, and offer big cost and time savings

AI recruitment tools are revolutionising recruitment. The latest tools can profile individuals based on CVs and publicly available social data, select candidates and offer interviews, all by automated means and without any human involvement. Genius, right? The cost and time savings are potentially dazzling. But before signing up blindly, employers need to consider a couple of bumps in the road which must be navigated with care.
Linzi Penman is a Senior Associate with DLA Piper

In our December article in The Scotsman, we discussed the potential for AI recruitment tools to inadvertently lock in historical biases and how to identify and combat this. Profiling of candidates by AI recruitment tools also involves risks arising from the candidate’s right to privacy.

Most of you will be aware of GDPR – the EU regulation on data protection which took effect on 25 May 2018 – either from the flurry of emails asking for your consent to continue marketing around that time, or from the recent news stories discussing fines in the tens of millions for breaches such as failures to properly obtain consent from individuals or for lack of transparency about data handling. The reputational cost of GDPR failures can be even more significant for companies. What lessons should be learned from these recent decisions for employers using, or considering using, AI in their recruitment process?


The GDPR requires companies to provide clear and transparent information about any actions taken which involve the personal data of “data subjects” (eg employees and applicants), including profiling.

It is unlikely that, at the point of collection, previous candidates will have anticipated the use of their CV data in AI tools (in combination with other sources such as social data). If you propose to use an AI recruitment tool which profiles candidate CVs, you should either ensure that applicants are informed about the likely uses of AI in relation to their CVs, or consider whether all of the CV data inputted into AI systems is (or can be) completely anonymised. (It is worth noting here that anonymisation is an extremely high threshold in the world of data: it means that all personal information has been removed such that it is no longer possible to identify an individual from the dataset.)

The GDPR requires employers to provide candidates with ‘meaningful information about the logic involved’ in profiling and any automated decision-making, as well as the consequences of this for the candidate. For example, if the computer identifies that someone does not have a university degree, will they automatically be told that their application has been unsuccessful?

Because of the commercial value of keeping their algorithms secret, AI suppliers are generally known for their lack of transparency about their technology. If you want to make use of machine learning recruitment tools, you will need to ask your supplier enough questions about how the tool works to be able to give candidates clear information about how their personal data will be used and how decisions will be taken.

The GDPR also gives people the right not to be subject to automated decisions where the decision could have a legal or similarly significant impact. This includes decisions that could affect employment status. To overcome this restriction, employers need:
- to be able to demonstrate that the automated decision-making was “necessary for the purpose of entering into the employment contract” – which is a high threshold;
- to introduce meaningful human intervention; or
- consent.

Obtaining consent may seem like the easy option, but it has long been established that consent is rarely valid in an employment context because of the imbalance of power between the employer and the applicant. Consents given during recruitment are particularly unlikely to be viewed by regulators as ‘freely given’, as a refusal to consent could prevent the individual from applying (successfully or otherwise) for the job. A more compliant route is for employers to introduce ‘meaningful human intervention’, for example by having a recruitment consultant or HR manager check the result of an automated decision before applying it to the applicant.

So, is it worth it? If you take steps to ensure compliance with applicable requirements, yes. AI recruitment tools are the future. Taking the time to avoid the potential privacy pitfalls is a small price to pay for the better outcomes and significant cost and time savings for your business.

Linzi Penman is a senior associate and Ishbel MacPherson is a legal director with DLA Piper