Third Sector Recruitment: Ultimate Guide to Bias-Free, Mission-Driven Hiring

By Joe Caccavale
October 29, 2021 · 9 min read

How do you balance diversity, retention and talent?

No matter how tight your budget, there are proven, behavioural science-based steps you can take to tangibly improve the diversity, predictive power and efficiency of your hiring process.

Organizations using these practices tend to see:

  • Up to 4x more ethnically diverse candidates attracted and selected
  • 3x as many suitable candidates
  • 93% retention rate after one year
  • 9/10 average candidate experience rating (including unsuccessful applicants)


Why is diversity so stagnant, even in the third sector?

If you’re reading this, chances are you already know the stats: the third sector is failing to reflect the diversity of both the wider population and the people it serves.

Diversity in the charity sector (chart)


But given the ethical nature of charity work, why is this?

The reality is that no matter how well-intentioned your non-profit may be, all humans are prone to unconscious bias.

Unconscious bias isn’t something only bad actors are guilty of…

It’s a necessary tool we use to make sense of our complicated world.

We take mental shortcuts and make quick-fire associations to deal with the thousands of micro-decisions we make every day.

Our brains draw conclusions using shortcuts and patterns, built from the information subconsciously stored in our mental lockers.

Whilst these tendencies are fairly harmless in everyday life, they can lead to negative consequences when it comes to making hiring decisions.

Our subconscious inclination to favour those most like ourselves and resist the unfamiliar means that candidates from underrepresented backgrounds are disproportionately overlooked.

Hiring discrimination by ethnicity (chart)

Everyone has biases, regardless of their role or the industry they work in.

If we want to improve diversity in third sector recruitment, the only way to do this is by designing bias out of the hiring process.

You can’t debias a person

But you can debias a process.

Anonymize your screening process to remove bias

When it comes to reviewing a CV, our brains tend to misfire in all sorts of ways.

Whether we’re aware of it or not, someone’s name, age or perceived ethnicity can all influence our opinion of them.

The easiest way to remove all of this irrelevant noise from decision making is to anonymize applications completely.

Given that the average CV review takes just 7.4 seconds, it’s fairly safe to assume that most screening processes rely heavily on ‘gut instinct’ snap judgments about someone’s suitability.

The problem with this is that for the most part, what you might call your ‘gut instinct’ is actually just unconscious bias.

Although only a single piece of the puzzle, anonymization is a simple yet highly effective practice when it comes to removing bias.

Since we can’t completely debias reviewers themselves, it's far more effective (and cheaper) to simply anonymize information that is likely to trigger bias. 

Use predictive assessments instead of CVs

So, you’ve removed all identifying information from a candidate’s CV…

What are you left with? Academic achievements and work experience.

Well, what if we told you that these are actually weak predictors of job performance?

Predictive validity is a measure of accuracy used in science and psychology - a means of quantifying how likely a given assessment is to predict future outcomes.

By replacing CVs with predictive assessments, we don’t have to trade candidate quality for diversity.

Predictive validity studies take a long time to complete and demand large sample sizes, but we can look to the Schmidt–Hunter meta-analysis for answers: it summarizes the findings of 100 years of research in personnel selection.

Predictive validity of assessment methods (chart)

How to use work samples

Here at Applied, we use 3-5 ‘work samples’ to anonymously screen candidates.

Work samples take small parts of the role and ask candidates to either perform or explain their approach to them.

By simulating tasks that would realistically occur in the role, you can directly test candidates’ skills without relying on flawed proxies like education and experience.

This approach means that every candidate gets a fair chance to showcase what they can do regardless of how or where they acquired their skills.

Here’s how we build work sample questions at Applied:

  • Identify skills: what are the 6-8 core skills needed for this job?
  • Identify tasks: what tasks will this person be doing in the role?
  • Create rubric: what does a good answer look like?
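
To make these steps concrete, here’s a minimal sketch of how a single work sample question might be captured. The structure, skill and rubric wording below are our own illustration, not Applied’s actual format.

```python
# A minimal sketch of a work sample question definition.
# Field names and example content are illustrative only - not Applied's format.
work_sample = {
    "skill": "Communication",  # one of the 6-8 core skills identified for the role
    "task": (
        "A long-standing donor emails to say they're unhappy with how their "
        "donation was spent. Draft a short reply."
    ),
    "rubric": {  # what good, mediocre and bad answers look like
        "good": "Empathetic, transparent about the spending, offers a concrete next step",
        "mediocre": "Polite and apologetic, but vague on the specifics",
        "bad": "Defensive, or ignores the donor's concern entirely",
    },
}
```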

If you’re looking for candidates who’ll embody your organization’s mission and values as well as the skills needed for the job, you can test for these too by treating them as if they’re skills.

Work sample question example

In the example above, you can see how work samples can be used to test for values like transparency, as well as more generic skills like communication and creativity.

Turning mission statements into values/skills

This is a fairer and more empirical way of testing a candidate’s alignment than a typical ‘culture fit’ assessment round.

Culture is fairly subjective - and will tend to reflect the dominant demographics that make up your organization.

If we look at the study below, we can see culture fit doesn’t actually tell us much about how well someone will perform in the job, or how long they’ll stick around.

Culture fit vs Job performance (chart)


This is why we look for mission and values alignment instead. We want to know if candidates are as passionate about our mission and ways of working as we are… and not how much fun they’ll be at the local pub.


Data-proofing your hiring with a scoring rubric

Since we know our judgment can be easily warped without us even being aware of this happening, it's essential to have a scoring rubric.

Giving yourself basic criteria to score against means that hiring decisions will be tied to objective requirements, rather than ‘gut instinct’.

For each work sample and interview question you create, you’ll need a review guide to score answers against.

We’d recommend starting out with a simple 1-5 star scale - with a few bullet points noting what a good, mediocre and bad answer might include.

Example review guide

For the most accurate, unbiased scores, have three team members score each assessment round.

‘Crowd wisdom’ is the general rule of thumb that combining judgments from a diverse crowd produces better results than asking any one individual.

Not only will having multiple reviewers lead to more predictive scoring, but it will also average out any individual’s biases.

And the more diverse your panel, the less biased your final scores should be.

At the end of the process, you can add up or average out each candidate's scores to make a data-driven hiring decision.
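
As a rough sketch (with invented candidates and scores), that averaging step might look something like this:

```python
from statistics import mean

# Illustrative only: three reviewers each score a candidate's three
# work samples on the 1-5 scale described above.
scores = {
    "Candidate A": {"reviewer_1": [4, 5, 3], "reviewer_2": [4, 4, 4], "reviewer_3": [5, 4, 3]},
    "Candidate B": {"reviewer_1": [3, 3, 4], "reviewer_2": [2, 4, 3], "reviewer_3": [3, 3, 3]},
}

# Average across reviewers so that no one individual's biases dominate the outcome.
for candidate, reviews in scores.items():
    overall = mean(mean(marks) for marks in reviews.values())
    print(f"{candidate}: {overall:.2f} / 5")
```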

Structure your interviews

Although we can anonymize the screening stage, interviews are much more difficult to debias.

Any time we meet someone face to face, there will always be a degree of bias at play.

However, there are steps you can take to minimize this bias and turn your interviews into more predictive assessments, rather than informal chats.

Adding structure to your interviews to make them as uniform as possible will make it easier to compare like with like. 

Ask all candidates the same set of questions in the same order.

As for the questions themselves, the best practice here is to use work sample-style questions to test skills, instead of probing into candidates’ backgrounds.

Since you’ll have candidates in front of you (in person or digitally), you can use both case study and work simulation questions to see how candidates would think and work, should they get the job.

Case studies: take a larger task or project that candidates would be working on, give candidates all of the relevant context and then ask a series of follow-up questions.

For a marketing role, for example, you could give candidates some Google Analytics data and ask them what they make of it, where you might be going wrong and what could be done to improve it.

If the task is particularly demanding, you can always give candidates the case study to think about ahead of the interview. If thinking on the spot isn’t a required skill for the role, then there’s no need to spring tricky questions on candidates.

Work simulations: the most predictive means of assessing candidates is having them role-play tasks in the interview. Whilst this won’t be possible for all roles, commercial roles easily lend themselves to work simulation tasks.

You could ask candidates to role-play sales calls, client meetings or presentations. Instead of asking candidates how they’d deal with a given task, have them demonstrate their skills by performing the task itself.

Psst. You can download our (free) Charity Interview Question Cheatsheet here.

Track and report on diversity

If you want to improve diversity, you’ll have to start measuring it…

Although a fair and ethical hiring process should be anonymous, you still need to collect diversity data in order to ensure your process is fair and that diversity is maintained throughout the hiring funnel. 

The idea of asking candidates for personal information about their identity and socioeconomic background isn’t always best received.

This is why before asking for any personal background information, it’s vital to communicate to candidates that this information is only ever going to be used at an aggregate level (meaning candidates will remain anonymous).

And sharing this information should never be mandatory.

Below you can see how we do this at the top of our equal opportunities form.

Equal opps form explainer copy

Three key areas to track...

Sourcing: making sure you start out with a diverse candidate pool will significantly increase your chances of improving diversity over time. If your budget is limited, then start tracking which job boards bring you the most diverse set of candidates - job board postings are expensive and not all job boards were created equal!

Fairness of your process: if you notice that candidates from a particular demographic tend to drop off at a particular stage in your assessment process, the fault may lie with one of your questions or its accompanying review guide. Corporate hiring processes in the U.S., for example, have been shown to prefer a masculine style of leadership. If you’re using scoring criteria for each question, you should be able to identify exactly where the issue is by looking at the disparity in scores (there’s a short sketch of this kind of check after these three areas).

Overall diversity of hires: to be able to report on the success of your diversity initiatives, you first need to establish a baseline. We’d recommend starting out by tracking ethnicity, gender, sexual orientation and disability status. Then, once you’re ready to take the next step, you can begin to track socioeconomic indicators (this data can be collected by asking candidates about their parents' education/income).
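
Here’s a minimal sketch of that kind of funnel check. The stages, group labels and outcomes are invented for illustration, and as discussed above, this data should only ever be handled at an aggregate level.

```python
from collections import defaultdict

# Hypothetical, aggregate-level data: (stage, demographic group, progressed?).
applications = [
    ("screening", "group_a", True), ("screening", "group_a", False),
    ("screening", "group_b", True), ("screening", "group_b", True),
    ("interview", "group_a", True), ("interview", "group_b", False),
]

totals = defaultdict(int)
passes = defaultdict(int)
for stage, group, progressed in applications:
    totals[(stage, group)] += 1
    passes[(stage, group)] += progressed

# A noticeably lower pass rate for one group at one stage flags where to look.
for (stage, group), total in sorted(totals.items()):
    print(f"{stage} / {group}: {passes[(stage, group)] / total:.0%} of {total} progressed")
```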

Here’s what diversity reporting looks like within the Applied platform.


Applied is the essential platform for debiased hiring. Purpose-built to make hiring empirical and ethical, our platform uses anonymized applications and skill-based assessments to identify talent that would otherwise have been overlooked.


Push back against conventional hiring wisdom with a smarter solution: book in a demo