Is recruitment AI discriminating against job seekers?
Empiric


Published 15/10/2018


Artificial Intelligence (AI) can be highly effective when applied to decision making: it is consistent and can act as an impartial judge, which makes it a very attractive tool for recruitment.

For example, in blind orchestra auditions (in which musicians played behind a curtain), women were 50% more likely to be selected than in a traditional audition process. Applied to recruitment, AI should in theory act as a similar driver for workplace diversity.

However, that theory hasn’t always translated into practice.


It has recently been revealed that Amazon developed an AI hiring tool that appears to have delivered systematically discriminatory results.

The objective of the programme was to create an algorithm that could identify the best candidates from a stack of CVs and track patterns in order to deliver better recommendations over time.

Because the majority of CVs supplied over the previous 10-year period had come from men, the algorithm began to optimise itself in their favour. It reportedly even downgraded graduates of all-female colleges and applications that featured the word “women’s”.

Lessons learnt

The problem was spotted and addressed, and the programme was formally ended in 2017, but it underlined that AI is only ever as good as its data. If populations are excluded from the dataset, or the data is heavily weighted towards certain outcomes, then the algorithm will continue the trendlines it has been given – or average the subset out with the rest of the group. What’s worse, as seen here, is when machine learning algorithms lean into existing inequality and exaggerate it.
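The mechanism is easy to reproduce in miniature. The sketch below is purely illustrative (it is not Amazon's actual system, and the CV snippets are invented): a naive keyword scorer trained on skewed historical hiring decisions ends up penalising a token like "women's" simply because it correlates with past rejections.

```python
# Illustrative sketch only: a toy keyword scorer trained on skewed
# historical hiring data. The CVs and labels below are synthetic.
from collections import Counter

# Synthetic "historical" decisions: 1 = hired, 0 = rejected.
# The skew: CVs mentioning women's groups were historically rejected.
history = [
    ("captained chess club", 1),
    ("led robotics team", 1),
    ("chess club member", 1),
    ("women's chess club captain", 0),
    ("women's robotics society lead", 0),
]

hired, rejected = Counter(), Counter()
for cv, label in history:
    (hired if label else rejected).update(cv.split())

def score(cv):
    # Each token scores +1 per appearance among hires, -1 per rejection,
    # so tokens correlated with past rejections drag the whole CV down.
    return sum(hired[t] - rejected[t] for t in cv.split())

# Two near-identical CVs: the one containing "women's" scores lower
# purely because of the skew in the training data.
print(score("chess club captain"))          # → 1
print(score("women's chess club captain"))  # → -1
```

The model never sees a "gender" field; it learns the bias entirely from a proxy token, which is why simply removing protected attributes from the input does not make a system fair.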

And given that AI is steadily advancing in HR, with AI expected to automate 16% of HR jobs in the next decade, it’s an important issue.

The optimal end goal is of course for organisations to achieve the benefits of blind testing: getting a sense of precisely how well the individual will perform in a role, while discarding irrelevant details. 

Amazon’s dilemma presents a cautionary tale, underlining the need for human operators to oversee algorithms – and to be able to audit precisely why they do what they do (which may be difficult to achieve given the complex nature of the calculations at hand).


Empiric is seeking to address the lack of gender diversity in the tech sector at a broader level through its Next Tech Girls initiative.

The Next Tech Girls programme helps young women find career paths into tech by connecting them with work experience placements, mentors and female role models – both championing women at the top and driving change from the first rung of the ladder upwards.


About Empiric

Empiric is a multi-award-winning business and one of the fastest-growing technology and transformation recruitment agencies, specialising in data, digital, cloud and security. We supply technology and change recruitment services to businesses looking for both contract and permanent professionals.


Empiric is committed to changing the gender and diversity imbalance within the technology sector. In addition to Next Tech Girls, we proactively target skilled professionals from minority groups, which in turn can help you meet your own diversity commitments. Our active investment in the tech community allows us to engage with specific talent pools and deliver a shortlist of relevant and diverse candidates.

For more information contact 02036757777.

