Updated 26/03/21 with additional steps to de-bias your hiring process.
Unconscious bias training is a waste of time.
Human decision making is prone to unconscious bias, and many of us are aware of the pernicious effects of implicit bias in the workplace. However, throwing money at the problem does nothing to improve this.
Bias can be removed from decision making, but as this article will explain, unconscious bias and diversity training isn’t the way to do this.
The rising popularity of unconscious bias training
If you look at Google search trends for ‘unconscious bias training’, you can see that interest has spiked.
As the Black Lives Matter movement took off in its current form, you can see that there was a huge increase in searches particularly in the United Kingdom, Australia, United States and Canada.
It’s also been making headlines here in the UK.
Training is due to be scrapped for civil servants, following the revelation that it doesn't actually work.
Before we dive into the evidence behind this claim, it’s worth briefly covering what unconscious bias (and training to prevent it) entails...
What is Unconscious Bias?
When we talk about ‘unconscious bias’, we’re referring to prejudices that we may not be aware of, and that sit outside our direct control. These tend to stem from stereotypes formed by our backgrounds, cultures, and personal experiences.
It’s worth noting that unconscious bias is natural and very much part of our psychological makeup. But, it can hinder our attempts to make fair, objective judgments.
Here are some of the most common types of unconscious bias:
Perception bias – we believe something is typical of a particular group of people based on cultural stereotypes or assumptions.
Affinity bias – we feel as though we have a natural connection with people who are similar to us.
Halo effect – we project positive qualities onto people without actually knowing them.
Confirmation bias – we have a tendency to try and confirm our own opinions and pre-existing ideas about a particular group of people.
In Daniel Kahneman’s ‘Thinking Fast and Slow’ he explains how our brains work using two systems.
System one is your fast, intuitive and emotional reaction to things. It's how you know that two plus two equals four, and how you can walk home without really paying attention to where you're going.
It's all the intuitive decisions that you make on a day to day basis.
It's easy and involves making mental shortcuts.
System one isn't bad
System one is what allows you to operate and actually function in the world.
System two, on the other hand, is the more slow, conscious and effortful thinking we do when making big, infrequent decisions.
Unconscious bias occurs when we should be using system two but instead fall back on using system one.
We rely on mental shortcuts and quick-fire associations to make decisions that should be more conscious and methodical.
Interested in unpacking this a little further? Take a look at our comprehensive guide to unconscious bias.
What is Bias Training?
Bias training seeks to raise awareness of the mental shortcuts that cause us to quickly make (often misguided) assumptions about a person’s ability or character. It usually involves a series of exercises and assessments designed to identify examples of biased thinking. The best known of these assessments is the Implicit Association Test, developed at Harvard University.
Bias training dates back to the 1960s, a period defined by social movements where many of our current anti-discrimination laws came to pass. Now, you would think that it was civil rights activists who called for the introduction of bias training.
However, the unfortunate reality, as pointed out by Tidal Equality, was that it was actually a corporate response to anti-discrimination law, helping companies (who previously had free rein to discriminate) avoid legal action.
Here’s an official definition from the UK’s Equality and Human Rights Commission:
“UBT are sessions, programmes or interventions that aim at raising awareness and/or teaching methods to alleviate unconscious (implicit) biases. These biases are views or opinions toward other people that we are unaware of and that are automatically activated and frequently operate outside our conscious awareness.”
So if you think about the examples of unconscious bias we just looked at, the training only aims to deal with a very specific type of unconscious bias: stereotype bias - the stereotypes and assumptions we bring to particular groups.
There are some 200 cognitive biases that can interfere with recruitment processes and people's decisions.
But unconscious bias training doesn’t tackle most of them.
It's not dealing with things like ordering effects - such as the serial-position effect (we remember the first and last people we interview, but not the people in the middle) or recency bias.
What does unconscious bias training look like?
So, let’s take a look at what bias training actually entails…
The standard technique is to start with an unconscious bias test.
This is a test which will tell you how much implicit bias you have and how many of these stereotypes or hidden assumptions you have in your psyche.
The Harvard Unconscious Bias Test, for example, will ask you to look at white faces and black faces.
You’ll then have to match the faces with various concepts - positive and negative.
How you match the faces to the concepts will determine how much implicit bias you have.
The rest of the training tends to look like this...
- Unconscious bias 'test' (e.g. Implicit Association Test)
- Unconscious bias 'test' debrief - explaining to participants what the results mean
- Education on unconscious bias theory
- Information on the impact of unconscious bias
- Suggested techniques for either reducing the level of unconscious bias or mitigating its impact
How much does diversity training cost?
The diversity training market was recently valued at nearly $8 billion annually. Google alone spent $114 million on diversity-related programs in 2014. Overall, according to Time Magazine, almost 20% of companies in the United States offer unconscious bias training.
A 2017 survey found that 35% of hiring decision-makers in the UK intended to increase their investment in diversity initiatives. It’s an undeniably lucrative market. But the question remains whether diversity training actually works.
On the surface, the C-Suite will eagerly buy into unconscious bias training, not least because a diverse and inclusive workforce can boost the bottom line. However, last year a study was carried out by the University of Pennsylvania’s Wharton School. The study sought to measure the results of diversity training. Unconscious bias itself cannot be quantified. But, the study did aim to see whether the training had a lasting impact.
Evaluating unconscious bias training exercises
Before looking at what the research has to say, let’s remind ourselves of what unconscious bias training actually claims to do.
The first is raising awareness - just making people aware that they have biases and that discrimination is a problem in the workplace.
Then there's reducing implicit bias.
And there's also addressing explicit bias.
Implicit biases are the stereotypes and assumptions we don’t even realise we have.
Explicit biases are the very much intentional and open prejudices people may have.
The last claim that bias training makes is around changing behaviour.
Reducing implicit bias
When it comes to reducing implicit bias, the evidence is mixed.
The most positive evidence comes from a 2012 study.
Researchers found that there was some reduction in racial preference after eight weeks, according to an implicit bias test with 91 participants.
However, a meta-analysis of 426 studies (involving more than 80,000 participants) found that although there was a reduction in bias immediately after training (albeit a very slight reduction), this disappeared over time.
In another study, over 10,000 employees from a large multinational company were invited to participate in “inclusive leadership workplace training.” The 3,000 respondents were assigned to one of two training groups or a control group. The training was a 68-minute online course combining gender bias training with general bias training.
Wharton followed up from the training exercises in two ways. First, they sent out a survey to gauge ‘attitudes’ towards diversity. They then began measuring responses to a series of unconnected workplace initiatives. These included nominating a fellow employee to be recognised for their performance, volunteering to mentor other employees, and offering guidance to new hires.
The study found that there was a measured shift in both attitudes and behaviours amongst women and ethnic minorities. However, there was only a marginal change in white men.
It seems that people for whom diversity is a personal issue felt more compelled to implement what they had learned. The results seem to justify scepticism about the effectiveness of unconscious bias training.
Conclusion: training can reduce implicit bias in some people for a period of time - up to eight weeks - but the effects are temporary, and implicit bias is never reduced to zero.
Reducing explicit bias
So how about the stereotypes that people hold consciously and actively?
Well, when it comes to reducing explicit bias, training has no effect.
In fact, there are reports of increased animosity, anger and resistance.
In one study, white subjects read a brochure critiquing prejudice towards black people.
When they felt pressured to agree, the reading actually strengthened bias against black people.
Another study looking at gender stereotypes in STEM found that whilst training had some impact on the implicit association, it had no effect on men who explicitly held stereotypes pre-training.
Conclusion: unconscious bias training is definitely not the way to tackle explicit bias and can even make things worse.
And now for the most crucial part - does unconscious bias training change behaviour?
Does it have any tangible, real-life impact?
Despite having been around since the ’60s, bias training’s effect on observed behaviour just hasn’t been measured enough.
There has been some research on its effectiveness in terms of behaviour change, so let’s dig into what we have…
A study of 829 companies over 31 years showed that bias training had no positive effects in the average workplace.
This study looked specifically at the progression of women, progression of minorities and the percentage of women and minorities in managerial positions.
It’s not looking directly at whether or how behaviours changed as a result of the unconscious bias training... because no one's been measuring that.
However, what has been shown is that despite having extensive unconscious bias training, nothing substantial has happened in terms of diversity/progression.
What the evidence does suggest is that simply making people aware of their biases doesn't actually change their behaviour.
If we look at the biggest study on behaviour change (the chart above), you can see that the companies focused on diversity training actually saw less diversity in terms of black women in management positions.
Conclusion: there's no solid evidence that unconscious bias training changes behaviour - what evidence exists suggests it has little to no effect on outcomes.
Unconscious bias training could even backfire
Bias training can actually backfire in lots of ways.
When we looked at explicit bias, we saw how people can react badly and become more resentful when confronted with their biases, but there are more subtle effects at play too…
Moral licensing is when we believe we can balance out our less moral actions because we have been good in the past.
In one study, people who'd been given vitamins were more likely to smoke and skip exercise that evening - having done a good thing (taking vitamins), they felt they'd earned a bad thing (smoking).
A study found that when given the opportunity to endorse Barack Obama (back in 2008), people were then more likely to discriminate against African Americans.
And when subjects are told that their employers have pro-diversity measures, such as training, they presume that the workplace is free of bias and react harshly against claims of discrimination.
So when it comes to bias training, there’s a strong possibility that participants will feel that they’re actually less susceptible to it - when the idea of the training is to show people that they are in fact prone to bias.
Given that people generally tend to rate themselves as less biased than others (a concept known as The Bias Blind Spot), training could cause people to make more decisions that are contrary to the practices it advises.
How to measure the effectiveness of unconscious bias training?
Since bias training’s impact on behaviour hasn’t been properly measured up until now, how can organisations follow up to ensure that what has been taught is being implemented?
Diversity training is the same as any kind of training. Your workforce will sit through presentations, workshops, exercises and the like. And of course, they’ll be able to reel off all they’ve learnt that day in a tidy recap. But if there isn’t a pressing urge to use all the information they’ve learnt - it will either sit in the recesses of their brain or simply evaporate.
So how can we bridge the gap between awareness and actual behaviour? Well for starters, employees would have to flag up every time they have a thought that could be biased. Then they’d have to ensure that they replace them with more realistic, well-rounded views.
Firstly, in a busy workplace setting, this doesn’t sound particularly practical. Secondly, for this to work in the first place the employee would have to be conscious of their unconscious bias. See? It’s a complete paradox.
Furthermore, if we propagate the idea that unconscious bias training is a magic bullet solution, hiring managers and recruiters could feel as though they don't have to second-guess their decisions.
There is even evidence to suggest that encouraging awareness of unconscious bias could hinder diversity initiatives. A study carried out in 2000 sought to measure ‘effects of thought suppression on evaluations of older job applicants.’
Participants first watched a series of videos about diversity. They were then asked to evaluate a series of job applicants. Those who were instructed to ‘suppress demography-related thoughts’ surprisingly rated older applicants less positively than their younger counterparts. If you’re interested, we recently published a blog post examining ageism and recruitment.
According to the Journal of Organizational Behaviour, the results suggest that ‘instructions to suppress stereotypic thoughts may have detrimental effects [...] if raters are cognitively busy when they implement these instructions.’ They referred to this effect as an ‘ironic evaluation process.’
Does implicit bias training work? Putting it all together
Key takeaways from the research around bias training:
- No rigorous evidence that unconscious bias training can lead to a lasting, significant shift in behaviour.
- Little effect on implicit biases for more than a few days, or eight weeks at the maximum.
- Some evidence that the training can be counterproductive.
So, does implicit bias training work? The short answer seems to be no.
However, these findings don’t necessarily mean that bias training is useless and should be completely disregarded.
What they do tell us, is what is actually possible to achieve through training alone.
If you’re thinking about using it as your primary strategy, you might want to consider the expense versus the potential results.
Frank Dobbin - a critic of diversity training in general - pointed out that for Starbucks to actually get everyone trained, they’d need to close 1000 stores for half a day in order to train 175,000 workers (at an estimated cost of $12,000,000).
Starbucks hires 100,000 new workers each year, so they’d need to run roughly half a dozen of these half-day rounds every year to keep everyone trained.
That works out at reducing implicit bias at a cost of $72 million a year.
And this is how corporations manage to spend $8 billion a year on diversity training!
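The arithmetic behind those figures can be sanity-checked in a few lines. Note that the workforce size, per-round cost and annual total are the article's numbers; the per-worker rate and implied round count are derived from them, not quoted by Dobbin:

```python
# Back-of-the-envelope cost of rolling out bias training at Starbucks,
# using the figures quoted above.
workers = 175_000
cost_per_round = 12_000_000   # one half-day session for the whole workforce
annual_cost = 72_000_000      # quoted yearly spend

cost_per_worker = cost_per_round / workers              # ~$68.57 per worker
implied_rounds_per_year = annual_cost / cost_per_round  # 6 rounds a year

print(f"${cost_per_worker:.2f} per worker, "
      f"{implied_rounds_per_year:.0f} rounds per year")
```

Scaled across thousands of companies, this per-head cost is how the $8 billion annual market figure becomes plausible.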
It's not that bias training isn’t useful whatsoever, it's just not a cost-effective or easily measurable means of reducing bias. There are things you can do significantly cheaper that will have a far greater impact.
Unconscious bias training doesn’t work because bias is human nature - bias has to be removed by design
We can all agree that workplace discrimination exists. However, no-one wants to admit that they have prejudices.
Training exercises can be a slog at the best of times. We all know how defensive people can get when accused of being biased. If not handled properly, it can be a breeding ground for resentment. This is hardly the ideal foundation for an inclusive workforce.
Picture a group of employees being asked whether they can recall an example where they may have been biased towards someone. It could be based on their race, gender, age or sexual orientation. It is highly unlikely anyone will admit to it, due to what is called ‘social desirability bias’. This occurs when people are asked a question, and give answers that seem 'right' or paint them in a more positive light.
Simply being aware of unconscious bias can only be so effective. While unconscious bias training makes people aware of discrimination and bias, awareness doesn’t fix the issue.
For example, people may be aware that racism exists. However, this doesn’t help them unlearn societal preconceptions about race.
So, if unconscious bias training has so many flaws, what alternative is there to create a more diverse workforce?
Talking about the problem and throwing stats around doesn’t bring about real change.
Even well-intentioned diversity & inclusion professionals are naive in thinking they can change companies by preaching - without changing the system itself.
Instead of simply telling people about their biases and hoping that they actively re-train what is essentially their own nature, why not just design systems that make bias impossible?
That’s where behavioural science comes in.
In short, behavioural science studies human activities to find patterns in the way we behave. When it comes to bias, a behavioural science approach would accept that simply being told not to be biased doesn’t have an effect on how we act, and would instead seek to find a solution that actually changes habits.
Change environments, not people
If you look at examples of how bias has been reduced successfully, it’s not by trying to change people.
It's by changing environments.
If de-biasing someone is in fact possible via training alone, it would take consistent and long-term effort, perhaps years!
What behavioural scientists do instead is create an environment which makes the right choice easy through ‘choice architecture’ - the way in which choices are presented to us.
Rather than telling people to eat their 5-a-day or to eat a balanced diet, for example, you’d give them a bowl like the one above, which divides up the portions.
You're not trying to change what people want to eat or when they want to eat it, you’re just designing an environment that makes it easy for them to make the right choice.
Another example of environment-changing involves pensions.
Instead of trying to get people to save for their pensions with simple messaging like adverts, you make it easy for them to save by making pension-paying the default option.
Whereas you’d usually opt-in, people here in the UK now have to opt-out.
As a result, the proportion of people paying into a pension has risen from 49% to 86%.
The most famous example of choice architecture - and perhaps the most relevant to unconscious bias training - involves an orchestra.
Back in the 70s, researchers managed to double the number of women getting through auditions by introducing blind auditions.
Since de-biasing the reviewers would take years of training, researchers changed the audition process itself.
So, when it comes to removing bias, it's much better to rearrange the environment in which people make choices, rather than attempting to change human nature.
A more effective way of removing bias from hiring...
Here at Applied, we’ve built a blind hiring platform that makes it nearly impossible for bias to affect decision making.
Rather than just tell hirers to try and be completely objective, why not anonymise applications altogether so that they have no choice but to be objective?
By removing unconscious bias from your hiring process, diversity will naturally improve - no training needed!
Bias starts at the screening stage
The typical screening process is rife with unconscious bias.
The more we know about a candidate, the more grounds for bias there are.
According to a comprehensive study of hiring outcomes here in the UK, candidates from minority ethnic backgrounds have to send 80% more applications to get the same number of callbacks as a White-British person.
This bias doesn’t just apply to ethnicity - your gender, social class, age etc can all affect your chances of being hired or promoted (you can skim our Recruitment Bias Report for more studies like the one above).
The problem with unconscious bias training is that it looks to de-bias people… but why not just remove the information that triggers bias in the first place?
Anonymisation is a solid first step towards de-biasing hiring, but this alone probably isn’t going to drastically improve diversity if you’re still using CVs.
Why? Because even once someone’s identity is hidden, the rest of the information on their CV still paints a vivid picture of their background (but not their ability).
Below you can see just some of the biases triggered by a CV.
Whilst removing obvious identifying information may reduce some biases, we know that those from minority backgrounds have a harder time in the hiring process and may therefore be less likely to have experience at well-known organisations.
By over-indexing on education and experience, you’re likely to find the same diversity gaps being perpetuated, even with an anonymous process.
Using work samples instead of CVs
So if we’re ditching CVs, what should we be using instead?
Well, take a look at the results of this metastudy below, testing the predictive validity of assessment methods...
As you can see, the most predictive assessments are known as ‘work sample tests.’
Spoiler alert: this is what we use at Applied and we’ve found that 60% of people hired through this process would’ve been missed via a typical CV screening.
Work samples are similar to situational-style interview questions, except they pose scenarios hypothetically - focusing on potential rather than past experience.
A CV looks at someone’s background, and hirers attempt to guess whether or not this background would foster the necessary skills.
Work samples, on the other hand, test for skills directly.
Sure, decades of experience might mean that a candidate gives the best answers, but this isn’t always the case. We’d rather give candidates a chance to showcase their skills rather than discounting them based purely on their background.
The idea of work samples is to simulate the job. You’re essentially asking candidates to either perform or explain their approach to a task that would likely occur should they get the job.
What could be more predictive than getting candidates to think as if they were already in the role?
Here’s a real-life example we used for an Account Manager role:
You've been given 50 accounts to manage, ranging from newly onboarded users to long term customers of Applied. They vary in terms of size, industry and knowledge of the Applied platform.
Your manager asks you to come up with a plan for how you will focus your time to maximise the growth of your accounts over the next 6 months.
What things would you consider in putting this plan together? How would you measure your success? Is there any other information you would need?
Skills tested: Accountability, Prioritisation, Data-driven
As you can see, this work sample question is very much reflective of what the job would actually entail. Whilst those with experience may be best equipped to answer it, it doesn’t necessitate a specific background: if you have the right transferable skills, you can give a great answer.
To create your own work samples, follow these steps:
- Decide on 6-8 core skills required for the role.
- Think of a realistic scenario or task that would test at least one of these skills.
- Pose the scenario hypothetically, asking candidates what they would do.
You can see more examples via our Work Sample Cheatsheet.
Once submitted, we put the work sample answers through the process below to eliminate lesser-known biases like ordering effects.
Chunking: Each application is sliced up by question. Instead of reading all of Candidate 1’s answers, then Candidate 2’s, you review every candidate’s answer to Question 1, then Question 2, and so on (this helps avoid the halo effect).
Randomisation: Now that you’re reviewing question by question, we’d recommend randomising the order candidates appear in for each question, since applications viewed first tend to be favoured (primacy bias).
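As an illustrative sketch of chunking plus per-question randomisation (the function name and data shapes are our own for illustration, not part of any particular platform):

```python
import random

def build_review_order(candidate_ids, num_questions, seed=None):
    """Return a review schedule: question by question ('chunking'),
    with candidate order shuffled independently per question
    ('randomisation')."""
    rng = random.Random(seed)
    schedule = []
    for question in range(1, num_questions + 1):
        order = list(candidate_ids)
        # A fresh shuffle for every question means no candidate is
        # consistently reviewed first or last across the application.
        rng.shuffle(order)
        schedule.append((question, order))
    return schedule

# Example: three candidates, two work sample questions
for question, order in build_review_order(["A", "B", "C"], 2, seed=42):
    print(f"Question {question}: review in order {order}")
```

Fixing the seed simply makes the example reproducible; in practice each review batch would use a fresh random order.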
Scoring criteria is essential
When it comes to reviewing work samples, you’ll need a set of criteria for each question.
This doesn’t need to be anything too detailed - we’d recommend a simple 1-5 star scale, with a few bullet points describing what a good, mediocre and bad answer would include.
Here are the criteria we used for the work sample above:
A bad answer:
- No reference to different ways to segment customers
- Not clear what success measures would be

A mediocre answer:
- Reasonable suggestions for segmenting customers but too focused on one dimension (e.g. MRR without a different strategy for brand new vs tenured customers)
- Success metrics focused only on increased revenue, without considering churn or customer satisfaction

A good answer:
- Understands there are multiple ways to segment a book of business and acknowledges that current value may not match future potential
- Suggests success metrics with multiple dimensions, e.g. revenue growth, limited churn and customer satisfaction
Criteria ensure that candidates are being scored against the actual skills required for the role, rather than the reviewer’s own biases.
It also means that you can get other team members involved in the hiring process.
To average out the biases of individual hirers, have three team members review candidates’ work sample answers.
Not only is this an effective means of removing bias, but it’ll also give you more accurate scores due to a phenomenon known as ‘crowd wisdom’ - the general rule that collective judgment is more accurate than that of an individual.
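A minimal sketch of that averaging step (the function name and data shape are hypothetical, for illustration only):

```python
from statistics import mean

def aggregate_scores(reviewer_scores):
    """Average each candidate's star ratings across reviewers,
    so no single reviewer's bias dominates the final score."""
    # reviewer_scores maps candidate -> list of 1-5 star scores,
    # one score per reviewer
    return {candidate: round(mean(scores), 2)
            for candidate, scores in reviewer_scores.items()}

# Example: three reviewers each scored two candidates on one question
print(aggregate_scores({"A": [4, 5, 3], "B": [2, 3, 2]}))
```

Averaging three independent reviews dampens any one reviewer's idiosyncratic preferences, which is the ‘crowd wisdom’ effect described above.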
P.S. you can read up on our interview process here.
The Applied platform eliminates unconscious bias in recruitment. Based on the concept of blind hiring, Applied’s purpose-built recruitment software anonymises candidates so they are judged purely on merit. Furthermore, the software helps companies write inclusive job ads, to guarantee a rich and diverse pool of applicants.
Applied has already helped some of the UK’s most well-known companies streamline their hiring process, from ASOS to Penguin Random House. To find out more about how it can transform your organisation, request a demo today.