Ethical Hiring Practices: 5 Steps to a Fairer Recruitment Process

Joe Caccavale

26 May 2021 | 7 min read


What is ethical hiring?

Ethical hiring means assessing candidates without discrimination. We're all subject to unconscious bias, so we have to implement practices that remove this bias and ensure every candidate gets a fair chance.

1. Start by anonymising your screening process

The more we know about a candidate, the more grounds for bias there are.

And the consequences of unconscious bias are real and measurable.

We have a tendency to make snap, subconscious associations and resist the unfamiliar, which leads to candidates from minority backgrounds being disproportionately overlooked.

A 2019 study from the University of Oxford looking at ethnicity found that candidates from minority ethnic backgrounds had to send 80% more applications to get the same results as a white British applicant (this bias isn't exclusive to the UK; US studies have reached similar findings).

Extra applications needed to receive a callback (chart)



It’s not just ethnic minority candidates who are overlooked.

Your disability status, sexual orientation and gender can all have an impact on your chances of receiving a callback.

This is why one of the most fundamental ethical hiring practices is anonymisation.

Key takeaway: Remove all identifying information from applications. This includes things like names, addresses and dates of birth - anything that gives away an aspect of someone's identity.
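
If you're screening applications in a spreadsheet or your own tooling rather than a dedicated platform, this redaction can be automated. Here's a minimal sketch in Python, assuming applications are stored as plain dictionaries - the field names are purely illustrative, not a real schema:

```python
import uuid

# Fields that could reveal someone's identity - illustrative, not exhaustive
IDENTIFYING_FIELDS = {"name", "email", "address", "date_of_birth", "photo_url"}

def anonymise(application: dict) -> dict:
    """Return a copy of the application with identifying fields stripped out,
    keyed by a random candidate ID so reviewers only ever see the answers."""
    redacted = {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}
    redacted["candidate_id"] = uuid.uuid4().hex[:8]  # stands in for the name
    return redacted

application = {
    "name": "Jane Example",
    "email": "jane@example.com",
    "date_of_birth": "1990-01-01",
    "work_sample_answers": ["Answer to question 1...", "Answer to question 2..."],
}

print(anonymise(application))
```

The important design choice is that reviewers only ever see the redacted copy; the link back to the original application is kept elsewhere and only used once scoring is done.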


2. Use skill-based assessments


To achieve truly ethical hiring, we have to go a step further than anonymisation.

Imagine you have a candidate’s CV in front of you…

You’ve now covered any identifying information, so what’s left?

Education and work experience.

The problem here is that if we simply assume that the best candidates come from the most prestigious universities and have experience at big-name companies, then candidates from underprivileged backgrounds are always going to be overlooked.

We know that biases tend to lead to these people being missed in traditional hiring processes.

So how can we expect them to have the background we usually look for?

Over-indexing on education and experience isn't just unfair - it's also a pretty inaccurate way of finding talent.

As you can see from the study below, they’re some of the weakest predictors of ability.

Predictive validity of hiring methods (chart)


You'll notice that at the top of the chart are what we call 'work samples'.

Work samples are interview-style questions designed to test the specific skills required for the job.

They take a realistic task or scenario that candidates would encounter in the role and ask them to either perform the task or explain how they would go about doing so.

The idea is to simulate the role as closely as possible by having candidates perform small parts of it.

Work samples are similar to your typical ‘situational question’ except they pose scenarios hypothetically - focussing on potential over experience.

Below is an example for a Sales Development Representative role:

Question: We run free training days in order to help Talent professionals de-bias their recruitment processes and understand how behavioural science impacts diversity & inclusion.

Once people understand the science, the chances of them becoming a customer are pretty good! *We provide candidates with a link for more information about our training days.

You have built a list of 1000 Heads of Talent in the US. Write an email explaining who Applied are, and inviting them along. Remember at this stage we aren't selling them the platform, just trying to get them to come along to the training day. 

In your answer, sign off with an "X" rather than your name to keep the answer anonymous.

Skills tested: Research, Communication

Rather than ask candidates to tell you about their experience and how it translates into skills, work samples put these skills to the test by getting candidates to think as if they're already in the role.

Testing for skills using work samples means that all candidates get a fair and equal chance to show what they can do, regardless of their background.

Key takeaway: Use 3-5 work samples to anonymously screen candidates


3. Data-proof your hiring with scoring criteria

Let’s be clear about one thing: gut instinct = unconscious bias.

We all have biases - it's an inescapable part of being human.

But if we don’t design around this bias, our decision-making is anything but objective.

Where most hiring processes go wrong is in not having any means of quantifying who the best person is. Instead, the candidate the hirers 'like' the most gets hired.

To data-proof your hiring, you’ll first need to give each screening and interview question a review guide/scoring criteria.

At Applied, we use a 1-5 star scale to score each question.

This scale is accompanied by a few bullet points, detailing what a good, bad and mediocre answer would include.

Question: 

We're in the process of reaching out to management consulting firms as part of this week's marketing campaign:

Draft an email to the Head of HR, introducing Applied, and trying to arrange an introductory call.

Scoring criteria:

# 5 Stars 

  • Concise, clearly structured and well-written message.
  • Skilfully and sensitively personalised to their particular needs.
  • Correct and clearly articulated understanding of what Applied is.
  • Some use of evidence or 3rd party comparison to demonstrate benefits/generate interest.
  • Not entirely focused on diversity benefits. Linking to the wider talent picture.

# 3 Stars

  • Good understanding of what Applied is.
  • Clearly written with good spelling/grammar and clarity/structure.
  • Personalised to the prospect.
  • Orientated around starting a conversation, not "salesy".
  • A clear call to action.

# 1 Star

  • Not personalised to the business, or accusatory and insensitively handled.
  • Lack of clarity about what Applied is.
  • Poorly written, in either spelling/grammar or clarity/structure.
  • Overly “salesy”. Trying to sell rather than initiate a relationship.

Whilst most ethical hiring guides you'll find online will tell you how to avoid explicit discrimination, data-proofing your process enables you not only to remove bias but also to have the data to prove it.

You can use candidates' scores to provide useful, objective feedback and show candidates that they were assessed fairly against the skills needed for the job.

Hiring decisions won't be based on which candidates you 'felt' were the best. You can simply offer the job to the highest scorer.

Key takeaway: Score answers on a 1-5 star scale
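
To make the "highest scorer wins" rule concrete, here's a minimal sketch, assuming each answer has already been given a 1-5 star score against its review guide (the candidate IDs and scores are made up for illustration):

```python
from statistics import mean

# candidate_id -> one 1-5 star score per screening question
scores = {
    "a1b2c3": [4, 5, 3, 4],
    "d4e5f6": [3, 3, 4, 3],
    "g7h8i9": [5, 4, 4, 5],
}

# Rank candidates by their average star score, highest first
ranked = sorted(scores.items(), key=lambda item: mean(item[1]), reverse=True)

for candidate_id, question_scores in ranked:
    print(f"{candidate_id}: {mean(question_scores):.2f} stars")

# No gut feel involved - the offer goes to whoever tops the list
best_candidate_id, _ = ranked[0]
```

Because every score maps back to a specific question and its criteria, the same numbers double as objective feedback for unsuccessful candidates.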


4. Structure your interviews

A structured interview is where all candidates are asked the same questions in the same order.

The idea is to make interviews standardised so that candidates are being compared fairly and against specific criteria.

Since questions are pre-set, structured interviews ensure that all questions are relevant to the job, keeping background out of the decision-making process as much as possible.

The questions you do choose to ask also matter.

The most ethical way to approach interview questions is to ask forward-looking questions.

This means asking questions hypothetically, rather than questioning candidates about their experience.

If there’s a skill you’d like to test, simply think of a scenario that would test this skill (just like you would for the work sample screening questions) and ask candidates how they’d tackle it.

Posing questions in this way shifts the focus from background to potential. 

Whilst previous experience and education may well make for the best answers, forward-looking questions make no assumptions so that every candidate is given a fair shot.

You can also use what we call 'case study tasks'.

Case studies present candidates with a larger task to work through. You lay out all of the context, show candidates any relevant material (charts, user personas, briefs etc.) and then ask a series of follow-up questions to get an idea of how they'd think and work through the task if they were to get the job.

Below is a case study we used for a Digital Marketer role:

Question: Below is some fake data to discuss. To meet our commercial targets we think we need to increase our demo requests from 90/month to 150/month. Below are some fake funnel metrics and website GA data. With a view to meeting this objective, talk through the above data and what it might mean.

Follow-up 1: What additional data would you need to work out how to meet the objective?

Follow-up 2: Given the objective, where would you concentrate your marketing efforts? Is there anything that you would do immediately? Where is the worst place to spend your time, given what you see in the data?

Just like your screening questions, each interview question should have its own scoring criteria.

Below is an example of how a question is scored at Applied. It's recommended that interviewers take notes so that they can score candidates immediately after the interview, or even score answers during the interview itself.

Interview scoring example
Key takeaway: Ask all candidates the same work sample-style questions
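
If it helps to picture it, here's a minimal sketch of a structured interview plan, assuming the questions are pre-set, asked in the same order for everyone, and each paired with its own review guide (the question text and criteria are illustrative only):

```python
from dataclasses import dataclass

@dataclass
class InterviewQuestion:
    text: str
    skills_tested: list[str]
    review_guide: dict[int, str]  # star rating -> what an answer at that level looks like

INTERVIEW_PLAN = [
    InterviewQuestion(
        text="Talk through the funnel data and what it might mean.",
        skills_tested=["Analysis", "Communication"],
        review_guide={
            5: "Clear, structured reading of the data with prioritised next steps.",
            3: "Reasonable reading of the data but little prioritisation.",
            1: "Misreads the data or offers no actionable interpretation.",
        },
    ),
    # ...remaining questions, asked to every candidate in this exact order
]

# Every interview works through the same plan, top to bottom
for number, question in enumerate(INTERVIEW_PLAN, start=1):
    print(f"Q{number}: {question.text} (tests: {', '.join(question.skills_tested)})")
```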

5. Use interviewer panels to negate bias

Even with structured interviews and forward-looking questions, there will still be bias at play.

A single interviewer will have their own biases, which will likely affect their scoring (although this would still be much fairer than a traditional interview).

The most effective means of mitigating this bias is to use multiple reviewers.

If you use a new three-person panel for each round, any individual biases will be averaged out over the course of the process.

And the more diverse your panels, the fairer the scores will be.

Reviewer panel example



This isn't just a more ethical hiring practice - it's also a more empirical one.

Collective judgement is generally more accurate than that of an individual - a phenomenon known as ‘crowd wisdom’.

Key takeaway: Have three reviewers for each assessment round
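
Averaging the panel's scores is all the maths involved. Here's a minimal sketch, assuming a three-person panel scores the same answer independently (reviewer labels and scores are invented for illustration):

```python
from statistics import mean

# candidate_id -> independent 1-5 star scores from a three-person panel
panel_scores = {
    "a1b2c3": {"reviewer_1": 4, "reviewer_2": 5, "reviewer_3": 3},
    "d4e5f6": {"reviewer_1": 2, "reviewer_2": 4, "reviewer_3": 4},
}

for candidate_id, by_reviewer in panel_scores.items():
    # Averaging across reviewers dampens any one reviewer's individual skew
    panel_average = mean(by_reviewer.values())
    print(f"{candidate_id}: panel average {panel_average:.2f} stars")
```

The important part is that each reviewer scores independently before the scores are combined - otherwise you're just averaging one loud opinion.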

Applied is the essential platform for debiased hiring. Purpose-built to make hiring empirical and ethical, our platform uses anonymised applications and skill-based assessments to find talent that would otherwise have been lost.

Find out how we’re pushing back against conventional hiring wisdom with a smarter solution: book in a demo