Stereotype Bias: Need-To-Know Insights for Fairer Recruitment

Published by Joe Caccavale
December 8, 2020
5 min read


Asians are good at maths.

Men can’t multitask.

Blondes are unintelligent.

Stereotype bias is nothing new. Whilst most of us recognise some of these stereotypes as being just that - stereotypes (and not hard facts), that doesn’t mean that they don’t affect our everyday perception and decision-making.

What is a stereotype?

A stereotype is an over-generalised belief about a particular group of people.

We assign a fixed set of attributes that we believe to be typical of that group.

This way of thinking about stereotypes was born out of Katz and Braly’s 1933 research.

100 Princeton University students were asked to select the attributes they associated with ten specific national, ethnic and religious groups.

They found that the degree of agreement among the students was too high to be purely down to the students’ experiences with these groups.

They concluded it was much more likely due to commonly held stereotypes.

Katz and Braly referred to this as “a group fallacy attitude.”

Stereotypes dictate how we see others

Stereotype bias plays a significant role in how we perceive and characterise others.

In one US study - by Fiske, Cuddy, Glick and Xu (2002) - perceptions of ‘out-groups’ were measured along two dimensions: warmth and competence.

You can see from the chart below that we have a fixed idea of how warm, friendly and competent someone is just from what they look like.

Chart: perceived warmth and competence of different social groups (Fiske et al., 2002)


Participants viewed homeless people as low in both warmth and competence, and Christians and middle-class people as both warm and highly competent.

And then there are groups such as men, who participants perceived as being highly competent, but not particularly warm.

Whilst these results will vary depending on where in the world the study is conducted, you can see that we make huge, sweeping assumptions about people based on the social group into which they fall.

In another study, participants were found to react more quickly to an association between “White” and positive attributes, such as “smart”, than to the pairing of “Black” with the same positive attributes.

What’s interesting about this study is that participants self-reported as not having many prejudices.

How can this be?

Well, at least some of our stereotype bias is unconscious.

We might not think that we’re subject to bias, but that doesn’t make it true.

A quick word on in-group bias

It’s safe to assume that the studies above were conducted with predominantly white participants - if a different demographic were given the same test, we’d expect to see very different results.

This is due to a bias called ‘in-group favouritism’, or simply, ‘in-group bias.’ 

We favour those who are in our in-group over those in an out-group.

Similar to the findings above, a 2001 study showed that the fusiform face area (the part of the human visual system used for facial recognition) responded more strongly to same-race faces than to other-race faces.

However…

A later study on the same part of the brain found that when White participants were randomly assigned to a mixed-race team, they showed more FFA activity for the faces of their own team members - regardless of the race of those faces.

Whilst in-groups often revolve around race, as the study above shows, this isn’t always the case.

We simply favour people who we see as ‘one of us.’

We will subconsciously look more positively upon those who we feel we can trust.

Having implicit biases like these doesn’t make us bad people.

It means that we’re human.

That being said, stereotyping and in-group bias in recruitment can have real-life consequences.

Stereotyping in the workplace 

As we saw from the chart above, we judge how competent someone is based on stereotypes.

Once you get to know someone better through working with them, you’re likely to see them more as an individual, but some of this stereotype bias remains.

As a result, people can be overlooked when it comes to both recruitment and promotions.

Whether we like to admit it or not, unconscious bias in hiring is commonplace.

We have biases around how candidates look, how they sound, where they went to university and where they worked previously.

Most hirers wouldn’t like to think of themselves as being prone to stereotype bias, but the truth is that we’re all susceptible to bias - whether we’re aware of it or not.

Whilst someone’s perceived warmth can impact their chances of being hired, stereotype bias works in another way too… it plays a role in dictating the types of people we expect to see in a given role.

Stereotypes are bred into us as children and continually reinforced throughout our adult lives.

Google ‘secretary’ and the image results are almost exclusively women.

Google ‘developer’ and they’re almost exclusively men.

When we hire for a given role, we have a tendency to look for someone who ‘fits the bill.’

But what a typical secretary or developer looks like is often a result of stereotype bias.

Since we generally think of developers as men, we’re likely to subconsciously look for a man to fill that role - that’s what the stereotype tells us a developer should look like.

Effects of stereotyping

Stereotype bias has very real implications for recruitment.

Candidates from minority backgrounds end up being disproportionately overlooked.

Below are the results of three studies around race discrimination.

In each study, identical CVs were sent out to multiple employers, with only the name changed.

The result: Candidates with non-native sounding names receive fewer callbacks.

And it’s safe to assume that the stereotypes we have around ‘out-groups’ have a fairly large part to play in this.

Chart: bias in the screening process - callback rates across five studies
Sources: Weichselbaumer (2016), Edo et al. (2013), Wood et al. (2009), Zschirnt & Ruedin (2016), Baert (2015)

How to avoid stereotyping when hiring

The most effective way to remove stereotype bias from recruitment is through a blind hiring process.

Blind hiring works by removing all identifying information from applications so that candidates are assessed anonymously.

Here at Applied, we took this a step further by replacing CVs with work samples - a less biased and more predictive method of assessing skills.

You can get the lowdown on how we’ve removed bias from hiring via our resource below, but here’s how it works at a high level...

Screening

Candidates anonymously answer four ‘work sample’ questions that are specifically designed to test the skills needed for the role by simulating parts of the job.

When using the Applied platform, these answers will be grouped by question and randomised to avoid ordering effects.

Answers are scored by three team members against pre-set criteria.
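To make the mechanics concrete, here’s a minimal sketch in Python of the two steps described above - grouping and shuffling anonymised answers, then averaging independent reviewer scores. All the names here are illustrative; this isn’t Applied’s actual code, just one way the idea could be implemented.

```python
import random
from statistics import mean

def anonymise_and_shuffle(answers):
    """Group (candidate_id, question, text) tuples by question and shuffle
    each group, so reviewers see answers one question at a time, in a
    random order, with no identifying information attached."""
    by_question = {}
    for candidate_id, question, text in answers:
        by_question.setdefault(question, []).append((candidate_id, text))
    for group in by_question.values():
        random.shuffle(group)  # avoids ordering effects
    return by_question

def combine_scores(reviewer_scores):
    """Average the independent scores each reviewer gave per candidate.

    reviewer_scores is a list of dicts, one per reviewer,
    mapping candidate_id -> score against the pre-set criteria.
    """
    totals = {}
    for scores in reviewer_scores:
        for candidate_id, score in scores.items():
            totals.setdefault(candidate_id, []).append(score)
    return {cid: mean(vals) for cid, vals in totals.items()}
```

For example, three reviewers scoring two candidates might pass `[{"c1": 4, "c2": 3}, {"c1": 5, "c2": 3}, {"c1": 3, "c2": 4}]` to `combine_scores`, and each candidate’s final score is simply the mean of the three independent ratings.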

Interviews

Although candidates will no longer be anonymous, questions are forward-looking (posed hypothetically rather than asking about past roles) so that candidates are tested on their skills, not just their experience.

Again, candidates are objectively scored by multiple interviewers - and scoring is done independently.


We built the Applied hiring platform to eliminate bias using behavioural science research. Curious? Feel free to book in a demo.