Last week’s Economist carried a dire title, ‘The next catastrophe’. They call economics the dismal science for a reason: with the world in the middle of a health and an economic catastrophe, the Economist chose to talk about the next one. They do have a point, though. If there is ever a time to draw collective attention to ‘what could go horribly wrong’ scenarios, this is probably as good as it gets.
This also raises two questions: why is it that systems (economies, organizations) don’t do nearly enough to plan for extreme events? And more importantly, can systems even do anything to protect themselves from such events?
Why don’t we plan for extreme events?
Actually, it turns out that there is a good evolutionary reason. If we had just sat around trying to plan for every conceivable disaster, it is doubtful we would have progressed as a species and gotten this far. Of the several cognitive biases that drive behavior, especially at a system level, I think two stand out (there are several more):
- Comparative optimism bias: the tendency to believe that we are less at risk of experiencing a negative event than others are. This is especially common in organizations, where managers routinely believe that they are less likely to be affected by an external event than their competition.
- Availability heuristic: the tendency to judge the likelihood of future events by how readily past examples come to mind. Almost all forecasting systems in organizations are built on that principle: if the extreme event hasn’t occurred in the recent past, there is no reason to believe it is around the corner.
Until, of course, an extreme event comes crashing in. Once the organization struggles its way through the crisis, it reacts in one of two broad ways. In the best case, it changes its policies to become more resistant to similar events (e.g. stress testing for banks after the 2008 crisis). In most cases, though, nothing substantive happens, and the crisis fades from the collective memory.
What are some of the cognitive repairs?
I think most organizations are coming around to the fact that extreme events are here to stay, and if anything, we are going to see an increase in their frequency. While organizations work out structural solutions (e.g. risk management needs a complete rethink), what I want to do here is offer a few ‘cognitive repairs’ that organizations would do well to incorporate into their decision-making processes.
#1: Learn to distinguish between events and the exposure to events – this is a common problem that is difficult to get past. Too often, analysts end up trying to decipher the event itself rather than trying to understand its impact. For instance, trying to predict the risk of customer default in the immediate aftermath of the current economic downturn is much harder than estimating, say, customer sensitivity to external shocks (using secondary indicators like payment delays, re-financing requests etc.). I am reminded of Kahneman and Tversky’s prospect theory, where they famously pointed out that people are more sensitive to the utility of changes in wealth than to wealth itself. As the sketch below illustrates, it is evidently more important (and more useful) to understand the distribution of f(x) than the underlying factor.
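To make the point concrete, here is a minimal sketch: run a hypothetical distribution of shocks through the prospect-theory value function, using the parameter estimates commonly cited from Tversky and Kahneman (alpha = beta = 0.88, lambda = 2.25), and compare the distribution of the felt impact with the distribution of the raw shocks. The shock distribution itself is purely illustrative.

```python
import numpy as np

# Value function from prospect theory: outcomes are valued as changes
# relative to a reference point, and losses loom larger than gains.
# Parameters are the commonly cited Tversky & Kahneman (1992) estimates.
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    x = np.asarray(x, dtype=float)
    gains = np.abs(x) ** alpha
    losses = -lam * np.abs(x) ** beta
    return np.where(x >= 0, gains, losses)

# Hypothetical distribution of shocks to a customer's finances
# (changes, not levels) -- skewed toward losses in a downturn.
rng = np.random.default_rng(42)
shocks = rng.normal(loc=-5.0, scale=10.0, size=100_000)

# The decision-relevant quantity is the distribution of f(x), not of x.
felt_impact = value(shocks)
print(f"mean shock:       {shocks.mean():+.2f}")
print(f"mean felt impact: {felt_impact.mean():+.2f}")
```

The raw shocks average around the assumed mean, but the felt impact averages noticeably worse, simply because losses are weighted roughly 2.25 times as heavily as gains.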

#2: Think in ranges, not specifics – Most managers are tempted to seek specific answers, and paradoxically, even more so when there is high uncertainty. This is rooted in our simplified ‘binary’ view of outcomes, while in many cases the actual outcome could follow a very different distribution (see the sketch below).
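A minimal sketch of what ‘ranges, not specifics’ can look like in practice: a quick Monte Carlo over a few hypothetical revenue drivers that reports percentiles instead of a single point forecast. The drivers, distributions and numbers are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical drivers of next quarter's revenue, each expressed as a
# distribution rather than a single "best guess".
demand_drop   = rng.triangular(0.05, 0.15, 0.40, n)   # fraction of demand lost
price_erosion = rng.uniform(0.00, 0.10, n)            # discounting pressure
default_rate  = rng.beta(2, 30, n)                    # share of receivables written off

baseline_revenue = 100.0  # in whatever unit the business reports
revenue = baseline_revenue * (1 - demand_drop) * (1 - price_erosion) * (1 - default_rate)

# Report a range (P10 / P50 / P90), not a single number.
p10, p50, p90 = np.percentile(revenue, [10, 50, 90])
print(f"P10: {p10:.1f}   P50: {p50:.1f}   P90: {p90:.1f}")
```

The specific percentiles matter far less than the habit of asking for them.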

#3: Think in Systems – More than ever, it is becoming difficult to apply the simplistic cause-and-effect thinking that has driven data analysis for years (blame the econometrics mindset of regression models). This is especially true when you are tasked with understanding system-level phenomena, say the risk of the next catastrophe that is likely to hit your business. As I mentioned a few weeks ago, physicists have been working with simulation models to crack similar problems for years. Now is a good time to invest in that capability: System Dynamics is an area of interest, especially when you are trying to model the impact of events (and decisions) at the overall system level. More on that next week.
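To give a flavor of the approach, here is a minimal stock-and-flow sketch in the System Dynamics spirit: one inventory stock, a delayed supply line, and a demand shock, integrated with simple Euler steps. The structure and every number in it are illustrative assumptions, not a calibrated model.

```python
import numpy as np

# Toy stock-and-flow model: one inventory stock drained by demand and
# refilled through a delayed supply line.
dt, horizon = 0.25, 52.0                    # step size and weeks simulated
steps = int(horizon / dt)

target_inventory = 100.0
adjustment_time  = 4.0                      # weeks to close an inventory gap
supply_delay     = 6.0                      # weeks for orders to arrive

inventory   = 100.0
supply_line = supply_delay * 10.0           # start at steady state (delay x throughput)

demand = np.full(steps, 10.0)
shock_step = int(10.0 / dt)
demand[shock_step:] *= 1.5                  # external shock: demand jumps 50% in week 10

history = []
for t in range(steps):
    deliveries = supply_line / supply_delay
    # Ordering rule: replace observed demand and close the inventory gap,
    # but ignore orders already in the pipeline -- the classic source of
    # overshoot and oscillation after a shock.
    orders = max(0.0, demand[t] + (target_inventory - inventory) / adjustment_time)
    # Euler integration of the two stocks
    inventory   += dt * (deliveries - demand[t])
    supply_line += dt * (orders - deliveries)
    history.append(inventory)

post_shock = history[shock_step:]
print(f"inventory after the shock: min {min(post_shock):.1f}, "
      f"max {max(post_shock):.1f}, target {target_inventory:.0f}")
```

Even a toy structure like this dips below its target and then overshoots it before settling back, which is exactly the kind of system-level behavior that point forecasts and simple regressions miss.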
This much is certain: organizations will need to build the capability to get better at estimating the impact of unforeseen events. To lean on the economists, Knightian uncertainty is the term used to “describe the lack of quantifiable knowledge about some possible occurrence, as opposed to the presence of quantifiable risk”. In other words, while we must all accept some fundamental degree of ignorance, it is just as important to keep striving to bridge that gap by building decision-support tools that help us prepare for the next big crisis, when (not if) it hits.
Further Reading:
- https://www.economist.com/leaders/2020/06/25/politicians-ignore-far-out-risks-they-need-to-up-their-game
- https://en.wikipedia.org/wiki/Prospect_theory As I have said earlier, I am a huge admirer of Tversky and Kahneman. Do take a look at ‘The Undoing Project’ by Michael Lewis. Highly recommended.
- https://en.wikipedia.org/wiki/Knightian_uncertainty Economics is full of highly underrated thinkers, and I would put Frank Knight in that category. Unfortunately, the related term ‘unknown unknowns’ has become something of a joke for political doublespeak, thanks to Donald Rumsfeld.