Fair Hiring 101: Your Go-To Manual for Inclusive Recruitment

Published by:
Joe Caccavale
February 1, 2021

IN THIS GUIDE: We’ll walk you through everything you need to achieve genuinely fair hiring, without spending a penny.

All of the steps below are based on behavioural science and are backed by research.

The numbers don't lie - here are some of the outcomes organizations using this process tend to see:

  • Up to 4x attraction and selection of ethnically diverse candidates
  • Traditional hiring would miss 60% of Applied hires
  • Reduce time spent interviewing unsuitable candidates
  • 96% retention rate after one year
  • 9/10 average candidate experience rating

The problem with traditional hiring 

Your standard hiring process is anything but ‘fair.’

Nobody likes to think of themselves as being biased…

But we’re all prone to unconscious bias - it’s just a part of being human.

WHAT IS UNCONSCIOUS BIAS? We all hold prejudices that we're not consciously aware of.

We tend to categorise others based on physical qualities, like gender and ethnicity, and make judgments based on mental shortcuts and stereotypes.

When it comes to hiring, the more we know about a candidate, the more likely we are to form biases against them - without even being aware of this happening ourselves.

And as a result, candidates from minority backgrounds end up being disproportionately overlooked...

Hiring discrimination - a look at the numbers

In the US: A 2004 study found that candidates with an African-American name would need an extra 8 years of experience to get the same number of callbacks as a candidate with a white-sounding name.

White sounding name vs black sounding name bias

In the UK: Inside Out London’s study found that candidates with a Muslim-sounding name are 3x more likely to be passed over for a job.

Adam vs Mohammed callback rates

In Germany: One study found that candidates with a Muslim-sounding name, who were also pictured wearing a headscarf, were 15% less likely to receive a callback.

Callback rates for Muslim name and headscarf in photo

This bias doesn’t just affect ethnic minorities - women are discriminated against too...

In the US: According to a study of university science faculties, when the name on an application was female, candidates were seen as being less competent and hireable.

Science faculty application reviews by gender

In Spain: A similar study conducted in Madrid and Barcelona found that hirers favoured male candidates, and that having children hurt women’s callback chances more than men’s.

How parenthood status affects callback rates

How does bias happen?

To get us through our everyday lives, our brains rely on mental shortcuts and associations to make decisions.

If you had to think consciously about every single micro-decision you made throughout the day, you’d never get anything done.

However, this intuitive system of thinking can lead to bias.

If we see flashing lights and someone in a high-visibility jacket, we associate this with emergency services.

This is relatively harmless - if not actually useful. 

But if we see someone of a certain race, for example, we might (subconsciously) associate them with a set of characteristics/stereotypes…

And these associations don’t disappear when we’re looking at CVs or sitting in an interview.

Note: unconscious bias doesn’t only work against people - it can also cause some candidates to be perceived more favourably than others.

Fair chance hiring practices: our behavioural-science based process

Cast your net wide with inclusive sourcing

Fair hiring begins at the sourcing stage.

How you build your job ads will dictate who applies.

The first thing you'll want to avoid is listing too many requirements.

Whilst some people will read them as a general guide as to who should apply, others will see them as strict criteria which must be fulfilled.

Ask yourself this: how many of the requirements you've listed are genuinely required to do the job?

You've asked for 5+ years of industry experience...

But what if a great candidate with just 4 years under their belt decides to qualify themselves out?

Now, factor in the knowledge that people from minority backgrounds find it harder to gain experience (we saw this in the studies above), and that women tend not to apply for roles unless they meet 100% of the criteria, whereas men will apply when they meet just 60% of the requirements.

Listing too many requirements isn't just a matter of being picky; it can actively harm diversity.

The diversity of your initial candidate pool matters.

Having one or two token candidates from an underrepresented group doesn't lead to fairer outcomes.

Studies have shown that when there’s just one woman in the finalist pool, her chances of being hired are statistically zero (because it highlights how different she is from the norm).

Finalist Pools vs Hiring Outcomes

The mistake many organizations make is over-relying on referrals to source candidates.

Whilst referrals can be a quick and cheap means of finding talent, they often perpetuate diversity gaps.

This is because referred candidates tend to reflect the employee who referred them.

According to PayScale’s report, female and minority applicants were significantly less likely to receive a referral than their white male peers.

You don't have to turn your back on referrals completely, just make sure it's only ever one of many channels.

You should also ensure that the employees you've asked to refer candidates are themselves a diverse group.

If referred candidates reflect their referrer, then a diverse group of employees should yield a diverse pool of candidates.

Key takeaway: cut down on requirements and don't over-rely on referrals

Anonymize your screening

If you truly want to achieve fair hiring, ditching CVs is a must. 

The information that CVs reveal - such as name, age, address - triggers biases and leads to discrimination.

Recruiters spend an average of six seconds reviewing a CV.

Which indicates that the fast, shortcut-based system of thinking is being used to make decisions.

Whilst some organisations address this by using anonymised CVs, this only solves part of the equation, even if it is a step in the right direction.

If you were to sift through a pile of CVs right now, what would you be looking for?

Candidates that ‘jump out’ at you?

And what would make someone ‘jump out’?

Chances are, it’s their education, their experience, or a combination of the two.

The problem is: education and experience also lead to bias - and aren’t predictive of actual ability.

If we simply assume that the best candidates come from the most prestigious universities, then candidates from underprivileged backgrounds are always going to be overlooked.

If you go to one of the best universities, you’ll likely get the best experience as a result.

And so the top jobs will continue to go to those from the most privileged backgrounds unless we change the way we hire.

Whilst fair hiring should be a priority, you also want your process to find the best people…

And if you're using CVs - your process isn’t fair or effective.

Looking at the results of this famous metastudy, you can see that education and experience aren’t ‘predictively valid.’

In other words, they suck at telling us who the best person actually is.

Predictive validity of assessment methods

So, if CVs aren’t the way forward, what is?

Well, the fairest way to assess candidates is by testing skills.

Education and experience can indeed help to develop and sharpen one’s skills.

You’re not expected to hire a team with no experience at all.

It’s just that experience itself doesn’t make someone the best person for the job - their skills do.

And trying to guess how someone’s skills would stack up just by looking at their work history isn’t the most scientific way of assessing candidates.

According to the study above, the most effective means of assessment is called a ‘work sample.’

WHAT ARE WORK SAMPLES? Work samples take parts of a role and turn them into assessments by asking candidates to perform the task or explain how they would go about doing so.

The philosophy behind work samples is fairly simple: the most predictive way of testing someone’s ability to do a job is by getting them to perform chunks of it.

Since we want to test for skills - work samples are built by identifying which skills are necessary for the role and then building questions that will test them.

Here’s a work sample we used for a recent Community Lead role:

You’ve been invited to be on a panel on hiring & recruitment. You’re the only D&I expert (possibly the only one that thinks it’s important there) in the room. What are your opening lines to the audience to convince and engage them on the subject?

Skills tested: D&I Knowledge, Communication

The more realistic your work samples are, the more predictive they’ll be.

You could take a one-off scenario (like above) or everyday task and turn it into a work sample.

All you have to do is pose the situation hypothetically by asking - ‘what would you do?’

Work samples don’t require candidates to have any specific experience or come from a specific background. If you can do the job then you have the chance to prove it - free from bias.

At Applied, candidates submit work samples completely anonymously - we have no idea who candidates are or where they’re from. 
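As a rough illustration (this is a sketch under our own assumptions, not Applied's actual implementation), anonymising applications before review can be as simple as stripping identifying fields and assigning each candidate a random ID, so reviewers only ever see the work-sample answers. The field names below are hypothetical:

```python
import uuid

# Hypothetical fields that could trigger bias and should never reach reviewers.
IDENTIFYING_FIELDS = {"name", "age", "address", "email", "photo"}

def anonymise(application):
    """Return a copy of the application with identifying fields removed
    and a random candidate ID attached."""
    anonymous = {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}
    anonymous["candidate_id"] = uuid.uuid4().hex[:8]
    return anonymous

application = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "age": 34,
    "answers": ["My opening line to the panel would be..."],
}

# Only the answers and a random ID survive; name, email and age are stripped.
print(anonymise(application))
```

The key design point is that anonymisation happens before any reviewer sees the submission, so the fast, shortcut-based judgments described earlier have nothing to latch onto.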

Key takeaway: fair hiring starts with de-biased screening - use 3-5 work samples instead of CVs. 
Work sample cheatsheet

Add structure to your interviewing

Interviewing is difficult to de-bias entirely.

But there are some tweaks you can make to ensure fair hiring.

Interviews generally start to go wrong when the conversation is taken off on tangents, delving into candidates’ backgrounds.

The more uniform you make each interview, the fairer the process.

This type of interview is known as a ‘structured interview’.

WHAT ARE STRUCTURED INTERVIEWS? This is when you ask every candidate the same questions in the same order.

Given what we know about education and experience, keep questions hypothetical.

Rather than asking candidates about a time when they did X, ask how they would do X if it were to happen.

***Disclaimer*** Years of experience may well make someone the best candidate, we would never dispute that. Experience is extremely valuable. But not all experience is equally valuable, and it may not be the specific type of experience you’re looking for that makes someone the best candidate. Work sample-style questions test for skills learned through experience, not experience itself.

You can also ask candidates about how they understand certain topics or what their recipe for success would be.

Here’s an example interview question we used for the same Community Lead role as above:

There are a lot of businesses that claim to have a community. What does a successful community look like to you? How would you measure it? 

Skills tested: Data-driven, Community Knowledge

You can see more real-life examples in our Interview Playbook.

Rather than attempting to deduce how someone would take to the role or what actions they’d take if they were to be hired, you can simply ask them.

This offers you an insight into how candidates would think if they got the job.

Taking this one step further, you can ask candidates to simulate tasks associated with the role.

Presentations, mock client calls, prioritising tasks or planning a project could all be turned into interview tasks. 

Key takeaway: keep interviews consistent and ask work sample-style questions

A note on ‘culture fit’

Whether it be at the screening or interview stage, most hirers would like to get a sense of how candidates would ‘fit’ with the company culture.

Can you test for culture and claim to have fair hiring?

This depends on what you’re calling ‘culture’.

If you see culture as a strict set of values that candidates have to fit into, then this could be problematic.

A company founded by a certain demographic, which predominantly hires people of that same demographic, will likely have a culture geared towards that type of person.

If you then hire for culture fit, you’re essentially looking for someone who fits into that demographic.

Culture should be something that evolves over time and is built upon.

Then it’s not a matter of candidates fitting in. 

It's a matter of what they can add. 

Here at Applied, we took a step back from culture altogether.

We test for ‘mission and values alignment’.

We’re on a mission to spread the word about fair hiring.

So we want to know how passionate candidates are about this mission and that they agree on the values necessary for the team to achieve this.

How much of a laugh someone would be at the pub doesn’t matter to us.

What we want to know is: do they see the value in what our company is doing?

Here are some example questions we’ve asked candidates:

  • What are you hoping to get out of this role? How does it contribute to your longer-term ambitions? 
  • Why do you want to join the team? Why now?

Key takeaway: test for mission alignment rather than culture fit

Data-proof your process to make fairer decisions

When left to make a decision based on who we think is the right candidate, unconscious bias tends to take over.

Some candidates may be very obviously more suited than others, but how can you test this objectively?

Even when using work samples, you’d still be putting through candidates whose answers you felt were the best.

The solution: give yourself scoring criteria for each work sample and interview question.

Your criteria can be as simple as a 1-5 star scale, with a few bullet points describing what a good, mediocre and poor answer would look like.

Below is an example work sample question with criteria...


What is your favourite SaaS website and why? How does it encourage inbound leads to get in touch (calls to action, sign-ups, chatbots) and how do they do a good job of this?

Review guide

1 star

  • No real effort made

3 star

  • Picks a website purely based on design or aesthetics
  • Speaks less about the inbound marketing aspects of the website

5 star

  • Picks a website that looks great and has a design that makes sense.
  • Describes aesthetic and why it is relevant to customer/product
  • Talks through lead gen strategy for page and why it is great

You can quickly draft review guides with this Interview Scoring Worksheet.

As you can see, the criteria (we call it a ‘review guide’) doesn’t need to be long or complicated.

Its purpose is to provide reviewers with something to actually score against, rather than going with their gut. Candidates will finish the process with an objective, numerical score - so there shouldn’t be any deliberation needed unless they’re neck-and-neck.

The review guide also means that you can invite other members of the team to help score candidates.

By using multiple reviewers, individual biases are offset by the rest of the panel. The more diverse the panel, the more objective the scores should be.

This is a phenomenon known as ‘crowd wisdom’ - the general rule that collective judgment is more accurate than that of an individual.

Ideally, you should have three reviewers scoring work samples, and another three reviewers at the interview stage.
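To make the crowd-wisdom idea concrete, here's a minimal sketch of how three reviewers' 1-5 star ratings on two work-sample questions could be averaged into a single ranking. Every candidate name and score below is made up for illustration:

```python
from statistics import mean

# Hypothetical 1-5 star scores: one inner list per reviewer,
# one number per work-sample question.
scores = {
    "candidate_a": [[4, 5], [3, 4], [4, 4]],
    "candidate_b": [[5, 3], [4, 4], [5, 4]],
}

def overall_score(reviewer_scores):
    """Average each reviewer's scores first, then average across reviewers,
    so no single reviewer's bias dominates the result."""
    return mean(mean(r) for r in reviewer_scores)

# Rank candidates by their aggregated score, highest first.
ranked = sorted(scores, key=lambda c: overall_score(scores[c]), reverse=True)
for candidate in ranked:
    print(candidate, round(overall_score(scores[candidate]), 2))
```

Averaging per reviewer before averaging across the panel keeps one harsh (or generous) reviewer from skewing the ranking more than the others.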

If you have a team large enough, you should also switch up the panel for each interview round (you can have one interviewer stay throughout to make candidates feel more comfortable).

Review panel example

Key takeaway: have multiple reviewers judge candidates against set criteria for the most unbiased scores.

Use feedback to nail your candidate experience

If you are going to be a champion of fair hiring, make sure you share this with candidates.

Before giving them their feedback, let them know how they were assessed and that bias was removed from the process.

In a world where most candidates are lucky to even get a rejection email, applicants will appreciate your transparency.

Using de-biased, skills based testing means that you can give candidates useful feedback.

You can simply tell them where they performed well, and where there was room for improvement.

Fair hiring is about testing skills, not personality, so make sure your feedback is strictly around skills.

Use our Candidate Feedback Template to get started.

Once you’ve fully established and fine-tuned your process, you can provide more comprehensive feedback.

Below is what Applied candidates receive from us at the end of the process (which is automated through the platform).

Candidate feedback example

This might look time consuming.

Especially when the bar is set so low for giving feedback.

But if you care about candidate experience, feedback like this is your secret weapon.

Across 300,000+ applications, candidates gave us 9/10 for candidate experience on average. 

Candidate feedback chart

Key takeaway: share unbiased scores with candidates.

Putting it all together: fair hiring in 4 steps

  • Screening: Use work samples instead of CVs
  • Interviewing: Switch to structured interviews 
  • Data-proofing: Give yourself a review guide
  • Feedback: Use scores to give personalised feedback

Applied is the essential platform for debiased hiring. Purpose-built to make hiring empirical and ethical, our platform uses anonymised applications and skill-based assessments to identify talent that would otherwise have been overlooked.

Push back against conventional hiring wisdom with a smarter solution: book in a demo