In the light of the Costa Concordia, the Scandinavian Star or, for that matter, the Titanic, it is interesting to see what is and will be written about a disaster (long) after the dust settles. We usually empathise with the casualties and the survivors, praising the bravery of all who helped or even rescued people against all odds. We then go on to wonder what actually happened and what we can learn from it, hoping to make our world a tiny bit safer in the process. And at some point, invariably, comes the blame question: we want someone to take the fall for it.

Why is that? Why do we want to blame someone for the mishap? Well, that’s relatively easy: we want to know that someone made a misjudgement, a failure, a mistake. Because if no one did, the ramification is that random mishaps can simply happen in our world, and that makes our understanding of the world we live in less secure, less safe.

But is that blame always justified and, more importantly, can we avoid being blamed?

To answer the former: there is an area of research in psychology concerned with deciding whether someone who commits an (immoral) act is to blame. Now let’s say that two people each throw a large stone off a viaduct at cars passing underneath (there was a series of these incidents not long ago in the Netherlands). One person’s stone lands on the side of the road, without doing any harm. The other person’s stone, however, hits the windscreen of a passing car, resulting in a serious accident. Both had the same intention, so judged by intent the person whose stone did not strike a car is just as blameworthy as the other. Moral luck, however, is the belief that you should hold someone to blame for the harm the action actually caused, not for the intention behind it. You would therefore blame the person who caused an accident more than the other; after all, the other caused no mishap, right?

So basically, keep your nose clean, stay out of any disastrous situation and you will be fine.

Ok, that might prove not to be so easy. The world of shipping has been evolving since, say, the dawn and doom of the Titanic. Vessels have become larger and more complex, and the sheer number of people they can transport has increased tenfold. We rely more and more on complex technical systems and we fancy ourselves safe in the arms of modern engineering. If you care for some very old-fashioned reading, you might try Joseph Conrad’s “Some Reflections on the Loss of the Titanic”[1] (1912). The points he makes still ring true today. He also points out that with the growth in the size and complexity of vessels comes the responsibility to be able to deal with the consequences of that added size and complexity. The question, however, is: can we deal with this?

Michael Grey[2] states that it becomes increasingly difficult to train for emergencies, let alone crises.

“Throughout your career, aboard every ship you have sailed in, you have undertaken drill after drill, dutifully practising mustering passengers, regularly lowering lifeboats, even undertaking “desktop” damage control exercises. In more recent years, you will have spent time on simulators. But in the back of your mind, you may well have thought, as you watched the passengers being instructed in the use of their lifejackets, or the boats sent away on a smooth harbour exercise, that such drills barely scratched the surface of what “could”, just possibly, happen.”

He does have a valid point: how do you train for calamities? But he mixes up emergencies and crises, just as most people do. Captain Majid Safahani Langroudi[3] points out the difference between an emergency and a crisis. An emergency is a known event: something we have a procedure for. A crisis is an unknown event, sometimes a situation we did not foresee, other times an unlucky combination of emergencies, so that our procedures are of no use. Most of the time we train for emergencies: being on a bridge simulator, dealing with navigational emergencies, conducting table-top exercises. We rarely, however, train for a crisis.

So, if we don’t prepare ourselves for a crisis, or cannot prepare ourselves well enough for an emergency, are we then to blame? Are we just as blameworthy when nothing happens as when something does? Of course not; there is no ill intent, and we try our utmost to keep our passengers, our crew and ourselves safe. But wouldn’t the safest course to sail be the one that tries to minimize the effects of a crisis or emergency? The one that prevents the hindsighters from criticizing you and your team? And you know what: the right kind of training gives you exactly that: hindsight. And since hindsight is always 20/20, you might actually learn a thing or two.


This blog is written by Arjan de Pauw Gerlings


[1] Joseph Conrad 1912, “Some Reflections on the Loss of the Titanic”

[2] Michael Grey 2017, “Taking the blame”

[3] Majid Langroudi 2010, “Enhance Maritime Safety by Distinguishing Between Emergency Situations and Crisis Situations”


2 thoughts on “Hindsight is always 20/20”

  1. As stated, following an incident the finger of blame is always pointed. However, in order to improve our understanding of failed sociotechnical systems, we must dissect the event with the aim of preventing the repetition of mistakes; findings need to be understood and shared. If events are reviewed and outcomes are communicated across sectors and industries, it may be possible to prevent further disasters of a similar nature; after all, no two incidents are identical. Ultimately it is human interaction that will cause system failure, whether that be from design, manufacture or misuse; if the responsible person is not identified, then we cannot prevent a repeat.


    1. Thank you for your thoughts on the subject, Chris. I agree that we should aim to prevent disasters, or at least try to contain their possible consequences. I also agree that no incident is identical to another.
      That is why I think it is a good idea to train, very specifically, on ways to limit the consequences of given crises.
      I strongly believe that there will always be incidents; the human mind is both brilliant and flawed, and with technical advances and the complexity of our world I would say it is inevitable that mistakes will be made. Those mistakes can lead to incidents, but are incidents also emergencies or even crises? I don’t think so. I think incidents can lead to emergencies if unchecked and, given enough complexity, to crises. We need ways of looking at situations, and certain skills and attitudes, in order to prevent incidents from evolving into emergencies or crises. We need certain structures that we can apply to very diverse situations, and then, maybe, we can prevent things from going from bad to worse.
      And the way we humans learn means that although it starts with communication, we need to experience the effects of our actions. Enter: simulations.
      What do you think, Chris? Is reading about incidents enough, or can we do more?

