Behavioural science can help us understand the psychological, social and environmental factors that affect how we think and act. These insights can give us an edge in complex, high stakes situations – whether we’re developing strategic communications, diagnosing what motivates bad actors, or behaviourally auditing security processes.
One practical and impactful way we’re applying these behavioural science insights at Schillings is to help debias decision making in a crisis. A crisis is a high-stress, time-sensitive environment that makes us more likely to rely on quick, automatic thinking rather than slower, deliberative thought. The result – we’re more exposed to cognitive and behavioural biases, less open to negative feedback, and more likely to defer to decisions aligned with our habits than with our goals.
Failing to debias decision making can have major consequences. Psychological and behavioural biases help explain everything from billion-pound overruns on major infrastructure projects to the tragedy of the Challenger shuttle explosion. You can have the best people in the room, but if bias is at play, you won’t get the best outcome.
Here are three ways you can apply behavioural science to debias decision making in a crisis.
1. Develop decision support matrices
In high-stakes situations, pressure can limit the bandwidth available for decision making. For example, people given a low-stress task are 50% more likely to pick a healthy snack than people given a high-stress task. In this “hot state”, we’re more likely to make quick, impulsive decisions than the longer-term strategic decisions of a “cold state”.
One way to avoid falling into automatic thinking is to develop decision support matrices (DSM) as part of crisis planning.
A decision support matrix sets out all of the parameters that need to be considered for decision-making in a crisis. It also assigns weights depending on each parameter’s importance. In the hot state of a crisis, it provides scaffolding to inform a decision and anchor us to the “cold state” analysis of what needs to be considered in a response.
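A decision support matrix of this kind can be reduced to a simple weighted-scoring exercise. The sketch below illustrates the idea; the parameters, weights and scores are hypothetical examples, not a prescribed template.

```python
# Minimal sketch of a decision support matrix (DSM) as weighted scoring.
# All parameters, weights and scores below are hypothetical illustrations.

def score_option(scores, weights):
    """Weighted sum of an option's scores across all parameters."""
    if set(scores) != set(weights):
        raise ValueError("scores and weights must cover the same parameters")
    return sum(scores[p] * weights[p] for p in weights)

# Parameters and importance weights, agreed during "cold state" planning.
weights = {
    "legal_risk": 0.4,
    "reputational_impact": 0.3,
    "speed_of_response": 0.2,
    "cost": 0.1,
}

# Hypothetical 1-5 scores for two candidate crisis responses
# (higher = better on that parameter).
options = {
    "issue_statement": {"legal_risk": 4, "reputational_impact": 5,
                        "speed_of_response": 5, "cost": 3},
    "wait_and_monitor": {"legal_risk": 5, "reputational_impact": 2,
                         "speed_of_response": 1, "cost": 5},
}

# Rank the options by weighted score, highest first.
ranked = sorted(options, key=lambda o: score_option(options[o], weights),
                reverse=True)
for name in ranked:
    print(f"{name}: {score_option(options[name], weights):.2f}")
```

The value here is less the arithmetic than the discipline: the weights are fixed in advance, so a team under pressure is anchored to its cold-state judgement of what matters rather than to whatever feels most urgent in the moment.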
2. Invest in psychological safety
The key to the best decisions is having all possible options on the table and a robust assessment of each one’s strengths and weaknesses.
Without psychological safety – the belief that it’s safe to speak up, ask questions or make mistakes – people are less likely to share and debate ideas, and more likely to fall in line with the hierarchy. In the case of the Columbia disaster, NASA identified the absence of psychological safety as a reason why engineers failed to raise concerns about the risk that caused the shuttle to disintegrate on re-entering the atmosphere.
Psychological safety must be nurtured – it requires investment before a crisis. Leaders can do this by admitting errors, praising people who challenge or question their views and not punishing people for mistakes.
3. Use red teams to stress test your plans
Groupthink occurs when group dynamics mean assumptions aren’t tested, focus shifts to reaching consensus or reinforcing a more extreme position, and overconfidence sets in.[2] High-stakes situations contain the key ingredients of groupthink: urgency, hierarchical decision making and the cohesion created by forming a response team.[3]
The Covid Inquiry highlighted groupthink as a significant reason for the UK’s failure to adequately prepare for and respond to COVID-19. A consensus formed – and went unchallenged – that preparing for an influenza pandemic was sufficient, while the socio-economic impacts of the response received too little consideration.
One way to counteract this is to set up a red team. Red teaming, a practice common in cybersecurity and military planning, ensures that critical thinking is applied at every step. The red team is given responsibility for interrogating a decision – questioning the assumptions, processes and readiness behind it to identify vulnerabilities that can then be addressed. This gives people explicit permission to be critical, encouraging them to speak up and challenge plans.
Why you should invest in debiasing
If you create a culture of psychological safety, develop decision support matrices to structure crisis responses and empower people to interrogate ideas using red teams, you’ll have taken major steps to protect yourself from biases.
When developing a crisis management plan, a focus on debiasing will leave you better set up for success and better prepared to make the best strategic decisions – and may also reduce the long-term damage from any crisis.
However, it is also in the aftermath of a crisis that this debiased decision-making will be crucial in the rebuilding of operations and reputations.
Applying a behavioural science lens – whether before, during or after a high-stakes moment – will give you an edge.