Our brain is perfect for crisis management, or is it?

Do you also marvel at the human mind? That grey soup where every thought and every well-thought-through action comes from? I do. I think the human brain is by far the most complex system we have. Even thinking about it requires using it. And although an amazing array of small wonders springs from our brains, it is also, I hate to say, … euhmm … flawed.

There, I said it.

The human mind is a very powerful tool for handling crises: our brain collects information, notices where information falls short, and processes what it has, so that you can start to analyse what the situation will mean for your organisation in the (near) future. The mind weighs up the possibilities and starts to produce plans and actions to take. But does that go flawlessly? Ermm … no.

Since our brain is so complex, and because it has to deal with an enormous amount of information, it has its flaws. In psychology we call them biases [1]. It basically means that our mind has developed shortcut strategies to deal with complex problems in a complex world, with masses of information to process.

We have the most wonderful biases. The Cheerleader effect: a cognitive bias which causes people to think individuals are more attractive when they are in a group. Or the Decoy effect: a preference between options A and B shifts in favour of option B when option C is presented, an option very similar to B but in no way better. Or what to think of the IKEA effect: people place a disproportionately high value on objects that they (partially) assembled themselves, like furniture from IKEA, regardless of the quality of the end result.

But regarding a crisis we should be very aware of other, less innocent, biases. The Normalcy Bias, for example: The refusal to plan for, or react to, a disaster which has never happened before. This makes it hard for us to start preparing for emergencies, since they have not happened to us (yet), rendering us virtually clueless when we do face a crisis.

Or what to think of the Outcome Bias: The tendency to judge a decision by its eventual outcome instead of by the quality of the decision at the time it was made. So when we manage to blunder our way through an emergency and the outcome is, by chance, not negative, we think that we are masters of crisis management.

During an emergency we can also stray from the straight and narrow through the Automation Bias: depending excessively on automated systems, which can lead to erroneous automated information overriding correct decisions.

Or the Framing Effect: We can draw different conclusions from the same information, depending on how that information is presented. This drives home the point that we should be very aware of how we present information to others in the team.

Neglect of Probability: The tendency to completely disregard probability when making a decision under uncertainty; for example, treating a one-in-a-million risk with the same urgency as a one-in-ten risk, simply because both are possible.

Confirmation Bias: The tendency to search for or interpret information in a way that confirms our preconceptions. We may even discredit information that does not support our views. This means that once we have formed a picture of the situation in our head, it is hard to change that view, even when we have all the information we need to do so.

These mechanisms work in your head whether you are aware of their existence or not. We obviously need strategies to counter the flaws in our brains. That is exactly why we train people to apply information management and to structure a crisis: they need to structure the incoming information, their own thoughts and the workings of their team. If you have some way to step back and look at what you have, what you are missing and what the effect of both is on the situation at hand, you might just counter the effects of biases yet.

But of course, all these biases do not apply to you, right? You can honestly say that you make good, sound, well-informed decisions based on logic. And you may be absolutely right …

… well, okay then … one last bias:

Bias Blind Spot: The tendency to see oneself as less biased than other people [2].

[1] Peter O. Gray, 2014
[2] Emily Pronin & Matthew B. Kugler, July 2007
