Can Technology Improve Diversity? - Putting Applied to the Test [Part 2]

Published by:
Kate Glazebrook
October 3, 2016

Originally published on Medium, October 3, 2016 (read part 3 here).

We recently published the headline figures from our first experiment putting our recruitment platform, Applied, head to head with traditional CV sifting. It’s a tool designed by behavioural scientists to help organisations hire more fairly, and more effectively. We found it made a marked difference in who we hired as part of a graduate round for the Behavioural Insights Team, and that CVs were pretty hopeless as a sifting tool.

But what about diversity, you ask? Since Applied is designed to help organisations harness the benefits of more diverse teams, we were keen to see whether it delivered on that promise. Well, we had some surprises here too.

It doesn’t look like we had any real gender bias in either of our processes. But the Behavioural Insights Team is already a fairly gender-balanced team, which might mean we’re less likely to see gender bias in CV sifting anyway.

We saw directional evidence that Applied was less biased against candidates from non-White backgrounds and candidates with a disability, but given our sample sizes, we can’t confirm this statistically.

We found that candidates with degrees from so-called ‘better’ universities, and those who had previously worked for name-brand employers, tended to do better in both of our sifts. But since the Applied sift relies solely on how candidates respond to work challenges, and reviewers have no way of identifying them or connecting them to prior experiences, it’s clear that we were better able to pick out the candidates who had learned something from those opportunities and could convert it into the skills we were looking for.

The area where we really saw differences was educational background: on average, candidates who got through Applied had a wider range of attainment levels, whereas our CV sift appears to have focused much more heavily on attainment as a metric. This is relevant in light of the growing number of organisations moving away from using educational attainment as an application threshold, and the increasing evidence from the likes of Google and IBM that the grades you get at university are a poor predictor of how good you are on the job, especially a few years post-graduation.

What we saw here is some evidence that using Applied made for a fairer process than traditional CV sifting, and that it may be especially good for promoting social mobility based on educational background.

In part 3, we tackle the question most HR folks are hankering to find out: was it more efficient?