Is the new normal ‘non-normal’?

When all of this is over, we will look back and ask why we ended up at a point where such stark, difficult choices had to be made, and what we can do to avoid such extreme situations in future. Needless to say, decision making under these conditions is hard – but then again, that is what we are all called upon to do: make choices under limited information and ever-increasing uncertainty. My focus here is more humble and deliberately narrow – what can we take from this into our professional sphere of influence?

For me, if there is one key lesson to take away from a business lens, it is that we can no longer sustain the fallacy of looking at the world through the lens of a ‘normal distribution’. We are clearly dealing with an extreme fat-tailed process, and that requires us to challenge a few fundamental assumptions underlying our decision-making models:

  1. Do the Law of Large Numbers and the Central Limit Theorem reflect reality in a complex world? The simplifying assumption that allows data analysts to work with ‘sufficiently’ large samples no longer holds. It is now clear that the extreme (i.e. tail) events determine the mean, and to complicate things further, these events, being rare, take a lot of data to show up. For the same reason, it is time to take a hard look at metrics like averages, standard deviation and variance based on sample data – they too are of limited use.
  2. Is the traditional linear least-squares regression method too simplistic? Our default tendency to fit a ‘best fit’ line based on past observations and use it to drive decisions for the future is, to say the least, grossly inadequate. The very nature of a fat-tailed process implies that extreme events tend to occur in a convex (non-linear) way. And in an increasingly inter-connected world, the rate at which extreme events occur can also grow much faster.
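Both points can be illustrated with a small simulation. This is a sketch of my own (the distributions, parameters and cut-offs are invented for illustration, not taken from any real portfolio): a Pareto process with a fat tail, whose sample mean keeps drifting long after a Gaussian one has settled, and a least-squares line fitted on the ‘ordinary’ bulk of a convex response, which badly underestimates the extreme.

```python
import numpy as np

rng = np.random.default_rng(42)

# Point 1: under a fat-tailed process (Pareto with tail index alpha = 1.5,
# i.e. finite mean but infinite variance) the sample mean is dominated by
# rare tail events and converges painfully slowly.
alpha = 1.5
pareto = 1 + rng.pareto(alpha, size=100_000)   # shifted to support >= 1, mean = 3
gauss = rng.normal(loc=3.0, scale=1.0, size=100_000)

def running_mean(x):
    return np.cumsum(x) / np.arange(1, len(x) + 1)

# How much does the running mean still move after the first 1,000 draws?
drift_pareto = np.ptp(running_mean(pareto)[1_000:])
drift_gauss = np.ptp(running_mean(gauss)[1_000:])
print(f"running-mean drift after 1k obs: Pareto {drift_pareto:.3f}, Gaussian {drift_gauss:.3f}")

# Point 2: a 'best fit' line estimated on the non-extreme region of a
# convex (non-linear) response grossly under-forecasts the tail.
x = np.linspace(0, 10, 500)
y = np.exp(0.5 * x)                 # convex growth, e.g. contagion-style spread
bulk = x < 7                        # we have only observed the ordinary regime
slope, intercept = np.polyfit(x[bulk], y[bulk], 1)
forecast = slope * 10 + intercept   # extrapolate the fitted line to x = 10
print(f"linear forecast at x=10: {forecast:.0f} vs actual: {np.exp(5):.0f}")
```

The Gaussian running mean barely moves after a thousand observations, while the Pareto one keeps lurching with every large draw – exactly why sample averages and variances are of limited use here.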

And so, the biggest implication of working within this paradigm is that we not only underestimate the probability of extreme events, but also take a very naive approach to decision making:

  1. Trapped by the optimism bias: we tend to focus on data that makes the situation look rosier than it really is. For a bank, so long as the loan payments are rolling in, everything seems peachy. Nobody bothers to look at the underlying structural risk factors within the customer base that might presage a dramatic change in an extreme situation. In the span of just four weeks, the risk profile of large loan portfolios in banks has changed dramatically. To be sure, nobody could have predicted this – but what if banks were also monitoring risk at an individual customer level using secondary factors like, say, sectoral employment stability or nature of employment, and, more importantly, constantly refining the risk models as new information becomes available?
  2. A linear approach to risk: we tend to believe that the current level of risk is a good predictor of the future. This is not just a matter of convenience – it also gives us the space and room to innovate and take chances. And so a bank may choose to take a chance on an additional credit product for its current customers within the acceptable risk parameters. Is there a case to be made for adopting a non-naive (this is important!) precautionary principle that calibrates the risk profile to reduce the likelihood of extreme events?
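As a purely hypothetical sketch of what ‘constantly refining the risk models as new information becomes available’ could look like in its simplest form, here is a conjugate Beta-Binomial update of a customer segment’s default rate. The prior and the observed counts are invented numbers, not real bank data:

```python
# Hypothetical numbers: a rosy pre-crisis prior of ~2% defaults for a
# segment, then a fresh batch of 50 observed outcomes, 8 of them defaults.
prior_a, prior_b = 2, 98            # Beta prior: mean = 2 / (2 + 98) = 2%

def update(a, b, defaults, non_defaults):
    """Conjugate Bayesian update: fold new outcomes into the Beta counts."""
    return a + defaults, b + non_defaults

post_a, post_b = update(prior_a, prior_b, defaults=8, non_defaults=42)
prior_mean = prior_a / (prior_a + prior_b)
post_mean = post_a / (post_a + post_b)
print(f"default-rate belief: {prior_mean:.1%} -> {post_mean:.1%}")
```

Even a small batch of adverse outcomes moves the belief sharply – the point being that the model is revised continuously, rather than frozen at the pre-crisis estimate.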

You may say that this is 20/20 hindsight. Of course it is, and this is not to say that we should expect these scenarios to repeat with any frequency – hopefully, this is a once-in-a-generation event. But as they say, ‘a crisis is a terrible thing to waste’ – this is an opportunity to re-think the way we have been working thus far:

  1. Using data is more critical than ever: even as you push your teams to adopt a more data-driven mindset, take a critical view of all the underlying assumptions; demand more from your analysts – don’t let them get away with simplistic models and analyses. Now is a good time to take a hard look at the current customer definitions (e.g. segmentation, CLTV models) and risk profiles: do they well and truly reflect the reality of your customers?
  2. Listen to intuition: this may sound counter to the prevailing narrative of machine learning (and unsupervised learning in particular), but good old-fashioned intuition and business hypotheses based on ‘native intelligence’ are going to be more critical to your success than ever. Listen to your hypotheses even as you explore ways to extract weak signals from a variety of sources: the right interplay between the two could become key in a world of asymmetric uncertainty (which can make problems potentially worse, especially as they are convex to uncertainty). The long-term winners will be the ones who harness these two strands effectively to build systems and methods that deal with increasing levels of uncertainty.
  3. A bias for action: do not wait for the big shifts to happen – the reality is that businesses don’t work in the mode of always responding to big strategic shifts; it is really all about the daily block and tackle in response to signals from the market, from incremental product releases to fine-tuning supply chains to optimizing operations. What the current situation has brought out very starkly is the importance of staying closely connected to your environment and continually learning from its signals. And the best way to do this is to evolve a mindset of incremental decisions: instrument your actions, experiment with multiple scenarios, and build an effective feedback loop to learn and adjust. Don’t be shy of making a series of quick decisions – as we are painfully learning now, it is much less expensive to make early decisions fast than to wait for things to fall apart. In other words, an aggressive ‘action-based precautionary principle’ (paradoxical though it may sound).
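One minimal way to picture ‘a series of quick decisions with a feedback loop’ is an epsilon-greedy experiment over a few candidate actions. The options and payoffs below are entirely made up for illustration; the point is only the loop itself – mostly exploit what you have learned, keep experimenting a little, and fold every outcome back into your estimates:

```python
import random

random.seed(0)

# Three hypothetical actions with unknown true payoffs (success rates).
true_payoff = {"option_a": 0.3, "option_b": 0.5, "option_c": 0.7}
estimates = {k: 0.0 for k in true_payoff}   # what we believe so far
counts = {k: 0 for k in true_payoff}        # how often we tried each
epsilon = 0.1                               # fraction of deliberate experiments

for _ in range(5000):
    if random.random() < epsilon:
        choice = random.choice(list(true_payoff))     # experiment
    else:
        choice = max(estimates, key=estimates.get)    # exploit current belief
    reward = 1 if random.random() < true_payoff[choice] else 0
    counts[choice] += 1
    # Feedback loop: incremental update of the running success-rate estimate.
    estimates[choice] += (reward - estimates[choice]) / counts[choice]

print({k: round(v, 2) for k, v in estimates.items()})
```

Many small, cheap, reversible decisions with instrumentation beat one big delayed bet – which is one way to read the ‘action-based precautionary principle’.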

Needless to say, the speed premium becomes even more important in this new normal. 

Over the next few blogs, I will try to explore some of the underlying assumptions that have been dominating the data science narrative thus far: has the time come to challenge some of the most basic ones in this now non-normal (sic!) world?

A big hat tip to Nassim Taleb and his work in general – he has always been a huge inspiration, and is even more relevant in the current situation. Lots of his material is available on the internet.
