Perception bias definition
Perception bias occurs when our perception of others is skewed by the stereotypes and assumptions we hold about other groups.
Although we may believe we’re being objective, we all have unconscious biases that can affect decision-making - even if we’re unaware of them ourselves.
The science behind perception bias
Unconscious bias is natural.
It’s simply a part of being human.
This doesn’t mean that we don’t need to do anything about it (especially in high-stakes situations like recruitment), but it’s worth pointing out that having implicit biases doesn’t make you a bad person, it makes you human.
So, here’s how unconscious bias occurs...
According to Daniel Kahneman’s ‘Thinking, Fast and Slow’, we have two systems for everyday decision-making.
System 1: fast, intuitive thinking based on mental shortcuts.
This is how you can find yourself walking to work or the local shop on autopilot.
System 2: slow, considered and conscious thinking.
This is how you’d make big, one-off decisions, like those at work or when planning your summer holiday.
Generally speaking, System 1 is for those micro-decisions that we have to take every day. If you had to think long and hard about every one of the thousands of tiny decisions you make each day, you probably wouldn’t get out of bed at all!
So in terms of day-to-day living, this intuition-based way of thinking is absolutely essential.
However, we often fall back on System 1 when we should be using System 2.
Instead of thinking slowly and thoughtfully, we slip back into our gut-based pattern of thinking and rely on subconscious associations.
And recruitment is a prime example of this…
Perception bias in recruitment
Our subconscious minds are full of stereotypes and assumptions about certain groups.
Most hirers wouldn’t like to think that their judgment is in any way influenced by these biases…
But the reality is - we don’t even know we’re doing it.
Even a candidate’s name alone can trigger unconscious biases.
A candidate’s perceived race can affect their chances of getting a callback
In the UK: Inside Out London’s study found that people with a Muslim-sounding name are 3x more likely to be overlooked for a job.
In the US: A 2004 study concluded that candidates with African-American names would need an extra 8 years of experience to get the same number of callbacks as those with white-sounding names.
In Germany: A study found that candidates with a Muslim-sounding name, who were pictured wearing a headscarf, were 15% less likely to receive a callback than their white peers.
In all of these studies, researchers only changed candidates’ names. The rest of the resume was exactly the same.
Although we can’t know for sure, let’s assume that most of the employers involved were not explicitly biased.
If you asked them, they’d probably tell you that their recruitment process was based purely on merit and wasn’t biased…
But the evidence begs to differ.
Whether we choose to admit it or not, we’re all susceptible to bias.
And the same applies to gender
Perception bias doesn’t just affect our perception of people from different ethnic backgrounds; someone’s gender, for example, can also affect our judgement.
In the US: A study of university science faculties found that when the name on an application was female, job candidates were perceived as being less competent and hireable (the rest of the application was identical).
In Australia: A study of a hiring algorithm found that the AI favoured male candidates.
In the UK: A UN report found that 25% of people think men should have more right to a job than women.
Stereotypes influence how we see others
So, we know at the screening stage, perception bias (whether we’re aware of it or not) can play a significant role in decision-making.
But even when someone makes it past the initial screening through to an interview, stereotype bias and the ideas we have about different groups could cloud our judgement.
Someone’s voice, perceived class or disability could lead us to attribute certain characteristics to them or make assumptions about their ability.
Here’s how stereotypes can affect our perception of various groups:
Needless to say, these will vary depending on where you are - and which group you yourself are part of.
At this point, it’s worth pointing out that - as you can see from the chart above - perception bias doesn’t just involve judging out-groups more harshly.
We also perceive some people as being more capable or suitable based on unconscious assumptions and connections.
To summarise: candidates can be disadvantaged or unfairly favoured due to their membership of a certain demographic.
What can we do to remove perception bias from the hiring process?
Since unconscious bias is very much a part of the human experience, simply being aware of it and ‘trying to do better’ doesn’t have any measurable impact.
This is precisely why unconscious bias training doesn’t work.
So if you can’t change people, what can you change?
The answer is: environments.
You can’t impact the way a hirer’s brain works, but you can impact the process in which they make decisions.
By designing a hiring process that forces decision-makers to use System 2, we can effectively remove perception bias (as well as most other types of bias) from recruitment.
Here is our process for de-biasing your hiring...
Start ‘blinding’ applications
Since candidates’ names can lead to perception bias, the first step is to anonymise applications.
But it’s not just names that can lead to bias - things like addresses and date of birth can indicate that someone is part of a certain group, and should therefore be removed too.
If someone has the skills necessary to do the job, where they’re from, how old they are or what they do/don’t believe in shouldn’t be a factor.
This is why it’s best to remove all identifying information from applications, so that it doesn’t creep into our decision-making.
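One way to picture this step: reviewers should only ever see the blinded version of an application. Here’s a minimal sketch in Python; the field names (name, address, date_of_birth and so on) are our own illustrative assumptions, not a fixed schema.

```python
# Strip anything that could reveal a candidate's demographic group
# before reviewers see their answers. The field names are illustrative.
IDENTIFYING_FIELDS = {"name", "email", "address", "date_of_birth", "photo"}

def blind_application(application: dict) -> dict:
    """Return a copy of the application with identifying fields removed."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

app = {
    "name": "Jane Doe",
    "date_of_birth": "1990-01-01",
    "work_sample_answers": ["answer 1", "answer 2", "answer 3"],
}
print(blind_application(app))  # only 'work_sample_answers' survives
```

The point of centralising the blinding in one function is that no reviewer-facing code path ever touches the raw application.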
Say goodbye to the CV
Once you’ve removed personal details from an application, you’re left with just education and experience.
But these can also be problematic.
If top universities tend to be attended by those from more privileged backgrounds, is it not biased to favour these candidates?
And then there's also the fact that those who go to top universities tend to get the best jobs (and gain the best experience).
We don’t doubt that for many roles, prior experience might make someone the best person for the job.
But this isn’t always the case.
By looking for candidates with specific experience from a specific handful of universities, you’re essentially narrowing your search down to a single type of person.
So, if you’re going to remove experience and education from applications, what should you replace them with?
According to this famous metastudy, the most effective means of assessment is called a ‘work sample.’
Work samples take tasks involved in the role and ask candidates to either perform or think through them as if they were actually in the job.
What could be more predictive than simulating parts of the role itself?
If someone’s education and experience make them the best candidate, then this will show in their work samples.
By asking candidates to answer 3-5 work samples instead of submitting a CV, you’re testing for skills upfront, and removing perception biases around education and experience in the process.
You can grab our (free) guide to work samples here.
Switch to structured interviews
When meeting candidates face-to-face, perception bias is hard to eliminate completely.
You’re still going to see candidates, hear them speak and therefore naturally make unconscious connections.
Our perception bias may lead us to favour a well-spoken candidate, for instance.
That being said, the fairest (and most predictive) way to interview candidates is by asking everyone the same questions in the same order.
This is what we call a ‘structured interview.’
However, our de-biased interview process here at Applied goes a little further than just changing structure.
Whilst your standard interview questions typically focus on past experience, we use work sample-style questions.
Instead of the usual ‘tell me about a time when you did x’, simply ask ‘how would you do x?’
Having candidates across a table (or screen) from you gives you an opportunity to see how they’d think and work in the role.
If you give them a work sample or case study to work through (posing them like we just did above), you can see how they’d approach tasks should they get the job.
The best work samples are made by using real-life issues or tasks.
If there’s a project that’s coming up or a problem that your team recently solved, you can use these as a basis for your work sample.
Give yourself scoring criteria
Scoring criteria are essential for a bias-free hiring process.
Each of your work samples and interview questions should have their own set of criteria to score candidates against.
This doesn’t need to be extremely detailed, but some form of objective, pre-set criteria is required.
We’d recommend using a simple 1-5 star scale.
Here’s an example of scoring criteria we recently used for a question around identifying and working with B2B communities:
- Doesn't understand what a B2B community is, or makes little or no effort.
- Understands what a B2B community is and correctly identifies a successful one.
- Identifies some plausible reasons it has been successful, but doesn't articulate them well or insightfully.
- Doesn't relate the example to our own situation clearly.
- Correctly identifies a B2B community that has some similarities to our own.
- Identifies key features which we can emulate and relates them to Applied's challenges.
- Teases out why they think it has been successful in the USA in particular.
- Bonus points: identifies ways in which we might be different.
If you use a basic number scale, you can then add up each candidate's scores at the end of the process.
So instead of hiring who you ‘liked’ the most, you can simply look at who scored the highest.
To get the most accurate and unbiased scores - have two other team members review work samples with you, and another two join you to score interviews.
‘Wisdom of the crowd’ is the general principle that collective judgement is more accurate than that of any individual.
When it comes to hiring, having others review with you means that any individual biases will be levelled out, and scores should be more reflective of candidates’ actual ability.
Each reviewer should score work samples/interview questions independently before discussing them with one another.
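Putting the last two ideas together, the aggregation can be sketched as: average each question’s independent reviewer scores, sum the averages per candidate, then rank by total. The candidate names, question keys and scores below are hypothetical.

```python
# Aggregate independent 1-5 reviewer scores: average the reviewers per
# question (wisdom of the crowd), sum across questions, and rank
# candidates by total score rather than by gut feeling.
from statistics import mean

def total_score(scores_by_question: dict) -> float:
    """scores_by_question maps each question to its reviewers' 1-5 scores."""
    return sum(mean(reviewer_scores) for reviewer_scores in scores_by_question.values())

candidates = {
    "Candidate A": {"q1": [4, 5, 4], "q2": [3, 4, 4]},
    "Candidate B": {"q1": [5, 3, 4], "q2": [2, 3, 3]},
}
ranking = sorted(candidates, key=lambda c: total_score(candidates[c]), reverse=True)
print(ranking)  # highest total score first
```

Averaging before summing means a single harsh or generous reviewer can’t swing a candidate’s total on their own.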
We went in-depth on our process for de-biased, data-driven hiring in our free resource below.
Although the Applied Platform automates much of the de-biasing, you can still set up a recruitment process just like ours without spending a penny.
Applied is the essential platform for fairer hiring. Purpose-built to make hiring ethical and predictive, our platform uses anonymised applications and skills-based assessments to improve diversity and identify the best talent.
Start transforming your hiring now: book in a demo or browse our ready-to-use, science-backed talent assessments. We've also used data from over 100,000 applications to rank the top diversity job boards, to help you attract the most diverse range of candidates possible.