Structured Interviews: How to Data-Proof and De-bias Interviewing

Published by: Joe Caccavale
April 27, 2021 · 9 min read


FACT: Structured interviews are fairer and more predictive than unstructured interviews.

Forget about trick questions, informal chats and ‘recruiter instincts’...

This is how to turn your interviews into a data-backed science.

Below we’ll cover:

  • Why you should be using structured interviews
  • How to make interviews less biased and more inclusive 
  • How to introduce data into your interviewing 

Structured interview definition

A structured interview is where all candidates are asked the same questions in the same order.

Interviewers will have a prepared set of questions to ask candidates.

The idea is to make interviews standardised so that apples can be compared to apples.

Unstructured interview definition 

Unstructured interviews are more casual and free-flowing.

Interviewers will explore tangents so that each candidate’s interview experience will be slightly different.

Candidates could be asked completely different questions based on the direction the conversation takes.

Why use structured interviews?

Structured interviews are more predictive of ability 

The first reason to make the switch to structured interviews is a pretty straightforward one: they’ve been proven to be more effective than unstructured interviews.

Below are the results of a well-known meta-analysis of how well different assessment methods predict on-the-job performance.

As you can see, structured interviews were found to be significantly more predictive than unstructured interviews.

Predictive validity of assessment methods (chart)

Why is this?

Well, making interview techniques more uniform makes it easier to compare candidates objectively.

By knowing what you’re looking for before the interview starts, you’re able to identify who meets the criteria and who doesn’t.

It’s also fairly safe to assume that unstructured interviews involve a degree of irrelevant questioning about candidates’ education, experience and interests - all of which you’ll notice rank near the bottom of the chart above.

Unconscious bias plays a major role in unstructured interviews

A common misconception about bias is that it’s something perpetrated by a minority of ‘bad apples’.

Whilst some hirers may be more biased (or just more explicitly biased) than others, the truth is that we’re all biased.

It’s an unavoidable part of being human.

Here are just a handful of biases that could affect interviewing:

Stereotype bias: when we believe certain traits are true of a given group of people.

Confirmation bias: we naturally look for information that confirms what we already believe.

Halo effect: a good first impression can heavily distort how we remember an experience.

Contrast effect: we look to compare things rather than judge them on their own merit.

You can read more about interview bias here.

The more structure you add to your interviewing process, the more bias you can remove.

If unstructured interviews delve into people’s backgrounds and are largely go-with-the-flow affairs, then it’s not hard to imagine how interviewers might take a shine to certain candidates based on their own biases, rather than the candidate’s suitability.

Did the candidate go to the same university as you?

Did they make a good first impression?

Did you interview them at the start of the day, on a full stomach?

Did you find them attractive?

All of these factors can and often do affect how we perceive candidates.

Whilst not all of these can be solved with a structured interview alone, we can take steps to design away much of this bias...

How to conduct structured interviews 

Decide on the skills you’re looking for

This step sounds simple but is absolutely critical.

To make fair, data-driven hiring decisions, you’ll first need to define what you’re going to be testing for.

Usually, hirers outline a few vague character traits or experience requirements they’d like - but this isn’t going to cut it!

Instead, simply jot down 6-8 core skills needed for the role.

These can be a mix of hard and soft skills (eg SEO and strategic thinking).

This distinction is important because you want to be defining a skillset rather than a type of person.

In the latter case, candidates who don’t ‘fit the bill’ are likely to be unfairly overlooked.

If you tie each interview question to one or two of the skills you’re looking for, you’ll be able to map out each candidate’s strengths and weaknesses over the course of the process.

Ask questions that test skills, rather than background

A structured interview will only get you so far if you’re asking the wrong questions.

Traditional interview questions generally revolve around candidates’ past work experience and what they believe their strengths to be.

The root issue with unstructured interviews is that they don’t test candidates’ skills, they just ask about them.

Skills should be the only thing that matters.

Not where someone studied, where they’ve worked or what they do with their weekends.

Whilst these things could have an impact on a candidate’s skills, they alone can’t tell you who the best person for the job is - they’re just proxies for the skills you’re looking for.

Here are the three types of question we ask candidates here at Applied:

Work sample questions

Work samples aren’t too dissimilar from your typical ‘tell me a time when…’ interview question.

The key difference is that they’re posed hypothetically.

You simply explain the context of the scenario and ask candidates how they’d respond.

This enables candidates who haven’t experienced a given scenario before to showcase their skills.

It also acknowledges that some candidates may handle a situation differently today than they would’ve in the past.

Work samples can be based on either dilemmas or everyday tasks.

The best work samples are those which are the most reflective of real life.

An example work sample for a sales role:

Question: You haven't had a sale in x months but you know it's just been an unusual period and you think you're in a good place for the future. Your sales manager wants to help you look at your pipeline and activity levels. What do you do?

Case studies 

Case study interview questions give candidates a bigger task to think through.

This is an opportunity to present candidates with a real (or near enough real) project that they’d actually be working on.

After giving them the context, you can ask candidates a series of follow-up questions to see how well they understood the task and what their approach would entail.

Although you’re in a structured interview, you can still ask additional questions to help candidates explore their ideas - just make sure every candidate is afforded this help.

Below is a case study we used for a Digital Marketer role:

Question: Below are some fake funnel metrics and website GA data to discuss. To meet our commercial targets, we think we need to increase our demo requests from 90/month to 150/month. With a view to meeting this objective, talk through the data and what it might mean.

Follow-up 1: What additional data would you need to work out how to meet the objective?

Follow-up 2: Given the objective, where would you concentrate your marketing efforts? Is there anything that you would do immediately? Where is the worst place to spend your time, given what you see in the data?

Job simulations

Job simulations take parts of the role and ask candidates to perform them.

Whilst not appropriate for every role, job simulations are well suited to client-facing roles that require skills that are hard to test via regular interview questions.

Here are a few examples of job simulation tasks we’ve used:

  • Client meeting
  • Presentation
  • Sales pitch
  • Case study interview

As you can probably tell, these types of questions are designed to test purely for skills.

Industry-specific experience is valuable, but these questions test for the skills learned through that experience, rather than rewarding experience alone.

This means that talented candidates from non-traditional backgrounds get a fair chance to show off their skills, as do candidates from minority backgrounds who struggle to gain experience because of hiring bias.

Make sure you have scoring criteria

This is where data comes in!

To make data-based decisions, you first need to actually collect data.

For each interview question, score each candidate’s answer out of 5.

To make sure you’re staying as objective as possible, we recommend writing a rough and ready review guide for each question.

This needn’t be any more in-depth than a few bullet points describing what a good, mediocre and bad answer would include.

Here’s the review guide we used for the sales interview question above:

Review guide

1 star:

  • Incoherent or defensive answer

3 stars:

  • Open to discussing with manager
  • But ultimately they know best

5 stars:

  • Completely open to coaching and feedback
  • Recognises that the 'unusual period' is probably not just that
  • Is transparent on workload, leads etc.

Using a review guide enables you to overlook elements of a candidate’s personality or identity that you might otherwise be put off by, albeit subconsciously.

It also ensures that each answer is judged on its own merit so that a good or bad answer early on doesn’t overshadow the rest of the interview.

You can still make small talk with candidates to put them at ease and answer any questions, but scoring in this fashion will ensure that this doesn’t impact a candidate’s chances of being hired.

At the end of each interview round, you’ll be able to either add up or average out candidates’ scores to build a leaderboard.
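If it helps to see the arithmetic spelled out, here’s a minimal sketch in Python - the candidate names and scores are purely hypothetical, and you could just as easily do the same thing in a spreadsheet:

    # Each candidate's scores, one per interview question (hypothetical data)
    scores = {
        "Candidate A": [4, 3, 5, 4],
        "Candidate B": [3, 3, 4, 3],
        "Candidate C": [5, 4, 4, 5],
    }

    # Average each candidate's question scores and rank highest-first
    leaderboard = sorted(
        ((name, sum(qs) / len(qs)) for name, qs in scores.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

    for rank, (name, avg) in enumerate(leaderboard, start=1):
        print(f"{rank}. {name} - average score: {avg:.2f}")

Averaging (rather than adding) keeps candidates comparable even if someone misses a question, but either works as long as you apply it consistently.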

You can grab our free Interview Scoring Worksheet here.

Try three-person interview panels

Crowd wisdom is the general rule of thumb that collective judgement is more accurate than that of an individual - two heads really are better than one!

For the most accurate interview scores, we’d recommend having three interviewers (any more than that and you’ll see diminishing returns).

At Applied, we have a new panel for each interview round, so that any biases of a single interviewer are averaged out over the course of the process.

The more diverse your panels, the more objective the scores will be.

If you’re scoring against criteria, you can get team members from other functions to help score candidates - we’ve found this diversity of perspective adds real value.

Being interviewed by three people can be a daunting prospect.

Interviews are scary enough as it is.

So, the best practice is to divide the interview questions among the interviewers - this way, there are no silent judges ominously staring at candidates.
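To make the ‘averaging out’ concrete, here’s a tiny, hypothetical example: three interviewers score the same answer, and the panel average softens any single interviewer’s unusually harsh (or generous) score:

    # Hypothetical scores from a three-person panel for one candidate's answer
    panel_scores = {"Interviewer 1": 4, "Interviewer 2": 5, "Interviewer 3": 2}

    # Averaging the panel damps any single interviewer's outlier score
    panel_average = sum(panel_scores.values()) / len(panel_scores)
    print(f"Panel average: {panel_average:.2f}")  # 3.67 - neither the 2 nor the 5 dominates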

Forget about culture fit

For many hirers, the casual format of unstructured interviews is a chance to assess a candidate’s ‘culture fit’.

Does their personality fit in with the company’s culture?

The problem is - culture fit testing is often used as a smokescreen for biased decision making.

How well you’d get on with someone shouldn’t matter when it comes to hiring decisions.

This is how we end up with organisations - and even whole industries - being dominated by a handful of demographics.

If we look at culture as being something fixed, then we’re likely going to hire people who look and sound like those who are already in the organisation… and so diversity gaps will continue to grow.

Testing for mission and values alignment is a fairer, more objective means of assessing whether or not someone will be a good ‘fit’.

If you treat this like a skill, you can test it using work sample-style questions like you would any other skill.

Simply lay out some of the context around where your organisation is at and where it’s heading and ask candidates if they’re comfortable with this.

You could also ask candidates to explain why they’re interested in working at your organisation to see how bought into your mission they are.

Whilst not a world away from traditional, unstructured interviews, by scoring against criteria, you’re making sure candidates are being judged on their answers, rather than ‘gut feeling’.


We built Applied to assess candidates purely on their ability to do the job and remove unconscious bias from the hiring process to find the best person for the job, every time.

Our blind hiring platform uses behavioural science to make the process as predictive and fair as possible, using research-backed assessment methods to source, screen and interview. To find out how all this is done, feel free to request a demo or browse our ready-to-use interview questions and talent assessments.