Interview techniques for employers: what works (according to science)

Joe Caccavale

20 April 2021 | 9 min read


WARNING: We will not cover any generic, wishy-washy interviewing techniques below.

All of our recommended interviewing techniques are based on behavioural science research.

Here at Applied, we’re on a mission to de-bias and data-proof hiring. 

When it comes to recruitment, we care about two things…

  1. How fair is our process?
  2. How predictive is our process?

Traditional interviews tend to be awash with unconscious bias and lack predictive power.

By following the interviewing techniques below, you’ll be able to reliably pick out the best person for the job - and have the data to prove it.

The research around hiring methods

Before we go any further, it’s worth looking at the data around interviewing techniques for employers.

Predictive validity of interview techniques


We’ve cut out some of the less relevant assessment methods, but would strongly recommend taking a look at the metastudy for yourself.

As you can see, unstructured interviews and background information like education and experience don’t tell us much about a candidate’s ability.

So, now we know what works and what doesn't, here's what the research says we should do to make sure we're hiring the best people...

Structure your interviews

To conduct structured interviews, you’ll need to ask all candidates the same questions in the same order.

If you’re using traditional, unstructured interviews, chances are each interview will be fairly different from the next, and so will be difficult to objectively compare.

By making your interviews as uniform as possible, you’re essentially ensuring that apples are being compared to apples.

Structured interviews are not only more predictive than unstructured interviews (see chart above), but they’ll also be fairer.

If you decide to either grill or explore the background of one candidate more than another, then this could easily affect your perception of how the interview went.

Ask forward-looking, work sample questions

As you can see from the metastudy above, work sample tests are the most predictive assessment method you have at your disposal.

Work samples take parts of the role and turn them into tasks/questions by asking candidates to either perform them or explain their approach to performing them.

The idea is to simulate the job itself.

Using this style of question, you’re able to tap into candidates’ potential, rather than relying on proxies like education and experience (which we’ll cover shortly).

Work samples test for skills learned through experience, rather than for experience itself.

Here’s a work sample-style interview question we used for an Account Manager role:

You’ve been asked to put together a report on the Account Management function for the rest of the business to showcase the work of the team and its role in the wider organisation. What might you include in this report and why do you think it would be valuable to share this information with others?

By phrasing questions hypothetically (or ‘forward-looking’), you enable candidates who have the right skills, but not the specific experience, to shine.

Work samples aren’t too dissimilar to your typical ‘tell me about a time when’ questions, except they don’t require candidates to have encountered the given scenario before.

That doesn’t mean that those who have tackled it before wouldn’t give the best answers, but we’d rather test this than take their word for it, so that everyone gets a fair chance.

You could also use interviews to work through a case study.

Here at Applied, we often use an interview round (or at least part of one) to give candidates a chance to work through a bigger task.

This usually consists of a task like the one above, except with follow up questions.

Here’s how we did this for a Digital Marketer role:

Q1. Below is some fake data to discuss. To meet our commercial targets we think we need to increase our demo requests from 90/month to 150/month. Below are some fake funnel metrics and website GA data. With a view to meeting this objective, talk through the data and what it might mean.

Q2. What additional data would you need to work out how to meet the objective?

Q3. Given the objective, where would you concentrate your marketing efforts? Is there anything that you would do immediately? Where is the worst place to spend your time, given what you see in the data?

Forget about candidates’ backgrounds

The more we know about a candidate, the more grounds for bias there are.

Unconscious bias affects everyone but, left unchecked, it can lead to minority groups being overlooked.

Our brains naturally look to take mental shortcuts and make quick-fire associations.

Where a candidate went to school, their ethnicity or even their accent can trigger biases that influence decision-making.

Whilst we can’t anonymise interviews as we can with screening, we can take the emphasis off people’s backgrounds, especially since we know these things have little to do with ability.

Avoid interview questions about:

  • Education
  • Years of experience 
  • Interests

Whilst irrelevant questions about someone’s hobbies or travelling history might seem like a gentle way to get to know candidates, they do more harm than good.

Affinity bias is the tendency to feel a natural connection with people who are similar to us, and it can easily creep into interviews.

This means that we generally tend to favour candidates with whom we feel we have a ‘connection’.

This is just the tip of the iceberg; you can read more about interview bias here.

Make sure you have scoring criteria

Before you even write a job description, you should decide on the core skills you’re looking for.

For interviews (or any part of the hiring process) to be objective, they need to be tied to fixed criteria.

Every interview question should have its own ‘review guide’ so that it can be scored out of five (this just happens to be the scale we use).

For each question, make note of what a good, mediocre and bad answer might look like.

Here’s what this looks like in practice:

Question: Explain what Applied is to us, and why a Talent Team might use it.

Review guide:

  • 1 star: Either pretty incoherent or not well researched.
  • 3 stars: Slightly less clear and concise, with around 60% of the details right.
  • 5 stars: Clear and concise, with around 80% of the broad details right about what Applied is and its benefits.

Why are scoring criteria so important?

Well, it ensures that candidates are being judged on their answers, not their personality or credentials.

We also know that, in addition to biases around people’s backgrounds, we’re influenced by ordering effects.

For example, we tend to remember the most intense and the final moments of an experience most vividly.

This phenomenon is known as the ‘peak-end effect’.

One study put this to the test - asking participants to take part in two trials which involved putting their hand into uncomfortably cold water.

Trial 1: Place hand in 14°C water for 60 seconds.

Trial 2: Place hand in 14°C water for 60 seconds and then in 15°C water for 30 seconds.

Participants reported that despite being uncomfortable for longer, they found Trial 2 less painful.

Why? Because the end of the experience was less painful (and the peaks were the same).

Peak-end effect


If you conduct interviews without scoring as you go, you’re more likely to make inaccurate decisions based on these sorts of mental shortcuts.

A candidate that ends the interview on a high, for example, could be favoured over one who was more consistent but who ended on a lesser note. 

Note: small talk is absolutely encouraged - it’s important to make candidates feel at ease so that they can perform at their best - just make sure you’re scoring answers to the pre-set questions and not the small talk.

Have 3 interviewers 

On the face of it, having three interviewers sounds intimidating.

However, this step will actually make your interviews fairer and more accurate.

Crowd wisdom is the general rule that collective judgment is more accurate than that of an individual.

If one interviewer has a bias towards/against a candidate, this bias should be averaged out by the scores of the other interviewers.

The best practice here is to have a new three-person panel for each interview round.

This is also why scoring criteria are so essential - they enable you to get others involved in the hiring process. It’s worth noting that the more diverse your interview panel is, the more objective the scores should be.

At the end of the interview stage, you can simply average out interviewers’ scores to build a candidate leaderboard like the one below (our screening process is anonymous, hence the lack of names).

Candidate leaderboard

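If you’re curious what that averaging looks like in practice, here’s a minimal Python sketch - purely illustrative, with made-up candidates and scores, and not a feature of Applied itself - that averages each interviewer’s question scores and ranks candidates by their overall average:

```python
from statistics import mean

# Hypothetical panel scores: candidate -> interviewer -> scores (out of 5) for each
# pre-set question. All names and numbers are made up for illustration.
panel_scores = {
    "Candidate A": {"Interviewer 1": [4, 3, 5], "Interviewer 2": [4, 4, 4], "Interviewer 3": [3, 4, 5]},
    "Candidate B": {"Interviewer 1": [5, 2, 3], "Interviewer 2": [4, 3, 3], "Interviewer 3": [5, 3, 2]},
    "Candidate C": {"Interviewer 1": [3, 3, 3], "Interviewer 2": [2, 4, 3], "Interviewer 3": [3, 3, 4]},
}

def average_score(scores_by_interviewer):
    """Average each interviewer's question scores, then average across the panel."""
    return mean(mean(scores) for scores in scores_by_interviewer.values())

# Build the leaderboard: highest average score first.
leaderboard = sorted(
    ((candidate, round(average_score(scores), 2)) for candidate, scores in panel_scores.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for rank, (candidate, score) in enumerate(leaderboard, start=1):
    print(f"{rank}. {candidate}: {score} / 5")
```

Because every interviewer scores the same questions against the same review guides, a simple average like this is enough to surface the strongest candidates.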

Assess mission/values alignment instead of culture fit

Hiring for ‘culture fit’ can be problematic.

Whilst you want to hire someone who embodies your organisation’s values, culture fit is often used as a smokescreen to discriminate against certain candidates.

Even well-intentioned hirers can make the mistake of putting too much weight on culture fit.

If your organisation is made up of a certain demographic, this will generally shape the ‘culture’ and therefore anyone who doesn’t fit this mould may be overlooked.

Whilst you could switch to ‘culture add’ (what someone can add to your culture), it’s more effective to simply move away from culture testing completely.

At Applied, we test for mission and values alignment.

Do candidates care about our mission?

Do they appreciate our values?

You can turn this into an interview question by asking candidates why they’re applying and what they’re hoping to get out of the role - just make sure you have scoring criteria like you would for the work sample questions.

Bonus tip: ignore bad advice around ‘standard practices’

Most interview techniques for employers you’ll see are based on antiquated, biased practices.

Of course, being an effective hirer takes skill and practice, but ‘gut instinct’ is not one of those skills.

More often than not, this gut feeling is an unconscious bias, pulling you towards or away from a candidate.

Here are a few tips we’ve seen that you should definitely not add to your arsenal...

Forget about taking notes: Most of us don’t have photographic memories. When we recall past events, our biases tend to fill in the gaps. Taking notes is not rude, so long as you let candidates know why you’re doing so (for the sake of fairness)

Analyse body language: Only one thing should matter in an interview - does the candidate have the skills to do the job? Unless you’re an FBI interrogator, we’d advise against paying any attention to body language… interviews are scary enough as it is!

Ask about candidates’ previous jobs: We’re easily wowed by big-name companies and flashy titles, but they don’t make someone the best hire in and of themselves.

Draw on common connections: Mutual interests or similar backgrounds can play a huge part in hiring decisions - but they shouldn’t. This has nothing to do with someone’s ability.

Next steps