Published on: 28 July 2017 | Categories: Probabilistic Analysis, Scenario Planning

However interesting and generally appealing scenario thinking, as discussed in my previous article, may be, it provides only a broad, qualitative picture of the context within which one is considering a significant investment decision. A very important picture, to be sure, even with all its caveats. Yet we need numbers when it comes to sinking capital. No decision executive will sign off on an investment proposal that consists only of a fascinating narrative. She or he will want to see some underpinning of the potential profitability and capital efficiency. And some what-ifs. In other words: we need quantification, we need a cash flow outlook of some sort.

In fact, combining qualitative with quantitative thinking is one of the more difficult, and therefore avoided, challenges. Many restrict themselves to one of the two worlds. Yet I believe one has to try to cross the border. Quantification alone often cannot tell the whole story, but purely qualitative narratives run the risk of becoming hand-waving.

But this quantification, if done well, does not come easily. How can we put numbers on the future? One way is to assume that whatever went on in the past will replicate itself going forward. For some variables that is quite doable. If you are investing in wind or solar energy, the wind speed patterns and annual numbers of sun hours at a particular location can reasonably be inferred from historic records, with some averaging over multiple years. And the number of days in a year we can predict quite accurately! :) For many other variables, however, such an approach would not be credible.

Nevertheless, this is exactly what was (and is) common across many sectors. In fact, it is well known that this practice was one of the main causes of the financial crisis in 2008. The notion of "risk" was strongly tied to the degree of stock volatility in prior periods. Risk models were used that were driven solely by historic behaviour; there was no forward-looking element (see NYT, 2009). In essence, the quantitative models made people stop thinking. For the calculation of the minimum coverage of Dutch pension funds (an obligation stretching out over multiple decades), the use of the current interest rate is mandatory. Similarly, for oil and gas reserves calculations for the SEC (US Securities and Exchange Commission), it is required to use, for the purpose of asset valuation, the current oil price over several decades of remaining production. Of course these last two examples have auditing and legal dimensions, but they do illustrate the dilemma when considering future cash flows: we know that the value some important variable assumes today may not be representative of the future, but we (think we) have nothing else (that is auditable).

For investment decisions and strategy development we will often need both a scenario approach and quantitative analysis. Where the emphasis lies will depend on the investment project, but I would argue that for major capital decisions or strategies both should carry equal weight. What we see in practice is that (if done at all) scenario work and quantitative investment evaluations are poorly linked, perhaps because the two approaches are serviced by different departments and require different styles of working and different areas of expertise.

And of course, to come back to the earlier point, the quantification developed should use historic data as a basis, but it also needs to include a forward-looking, judgemental dimension. Scenario narratives and (probabilistic, judgementally forward-looking) quantification should go hand in hand, in some way, for maximum understanding and clarity.

For quantitative analysis we require a choice of models and methods: fit for purpose, not too complicated, but yielding consistent results. And we need mathematics, to do things smartly, quickly and consistently. To deal with uncertainty in the numbers, we need probability theory and probability distributions. There is no way we can arrive at exact estimates of all future variables (costs, prices, schedules, sales quantities, tax rates, etc.). But what we can do is try to estimate a range (under certain scenario assumptions).
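
To make that concrete, here is a minimal sketch of how ranged inputs propagate into a ranged outcome via Monte Carlo sampling. The variable names and the triangular ranges are purely hypothetical, chosen for illustration rather than taken from any real project.

```python
# Minimal sketch: propagate input ranges to an output range by sampling.
# All names and numbers below are hypothetical, for illustration only.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Inputs under one scenario, each given as a range (low, most likely, high)
# rather than as a single-point estimate.
price = rng.triangular(left=8.0, mode=10.0, right=14.0, size=n)   # $/unit
volume = rng.triangular(left=900, mode=1000, right=1200, size=n)  # units/yr

revenue = price * volume  # $/yr, one sample per simulated future

# Report the outcome as a range (P10/P50/P90), not one exact number.
p10, p50, p90 = np.percentile(revenue, [10, 50, 90])
print(f"Revenue: P10={p10:,.0f}  P50={p50:,.0f}  P90={p90:,.0f}")
```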

One of the most important distributions for this purpose is the lognormal distribution. It is representative of multiplicative processes (as the normal distribution is for additive processes). This means that the product of variables with ranges of uncertainty will tend to be lognormally distributed. Examples are volumes (L*W*H) and revenues (number of units sold * price). The lognormal distribution is also used quite a bit to model share price behaviour. It has an elegant mathematical formulation and allows for modelling upsides and downsides (with a trick). A recent article I came across (Surovtsev, D. and Sungurov, A., "'Vaguely Right or Precisely Wrong?': Making Probabilistic Cost, Time and Performance Estimates for Bluefield Appraisal", SPE 181904, SPE Economics & Management Journal, July 2017) confirms that most cost, schedule and production variables are best represented by such a distribution.
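
A short simulation of my own (not taken from the cited paper) shows the multiplicative effect at work: multiplying three independent, symmetric dimension uncertainties yields a volume whose distribution is clearly right-skewed, while its logarithm stays much closer to symmetric, which is the lognormal signature.

```python
# Illustration: a product of symmetric uncertainties comes out lognormal-ish.
# The dimension ranges below are hypothetical and deliberately wide.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

L = rng.uniform(5.0, 15.0, n)  # length, symmetric (uniform) uncertainty
W = rng.uniform(2.0, 8.0, n)   # width
H = rng.uniform(1.0, 4.0, n)   # height
volume = L * W * H             # multiplicative combination

def skewness(x):
    """Sample skewness: roughly zero for a symmetric distribution."""
    return float(np.mean((x - x.mean()) ** 3) / x.std() ** 3)

# The product is clearly right-skewed; its logarithm is much closer to
# symmetric, i.e. the volume is approximately lognormally distributed.
print(f"skewness of volume:      {skewness(volume):+.3f}")
print(f"skewness of log(volume): {skewness(np.log(volume)):+.3f}")
```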

Rather than (only) relying on integrated systems and black-box-like simulation software, it may be useful for practitioners and analysts to get into the guts of the lognormal distribution, understand its mathematical articulation and have some practical formulas to calculate things by hand. It gets a bit nerdy, but do check out a useful article for this purpose in our free knowledge base (in which more articles are to come).
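
As a taste of such hand calculations, the sketch below derives the parameters of a lognormal from a low/high percentile pair, a common back-of-the-envelope exercise. The percentile convention (reading "low" as the 10th and "high" as the 90th percentile) and the example numbers are my own assumptions for illustration; the knowledge base article remains the reference for the full treatment.

```python
# Back-of-the-envelope lognormal fit from a low/high estimate pair.
# Convention assumed here: "low" = 10th percentile, "high" = 90th percentile.
import math

Z90 = 1.2816  # standard normal z-score of the 90th percentile

def lognormal_from_percentiles(low: float, high: float):
    """Return (mu, sigma) of ln(X) given the 10th/90th percentiles of X."""
    mu = (math.log(low) + math.log(high)) / 2.0
    sigma = (math.log(high) - math.log(low)) / (2.0 * Z90)
    return mu, sigma

# Hypothetical example: a cost estimated between 80 (low) and 150 (high).
mu, sigma = lognormal_from_percentiles(80.0, 150.0)
median = math.exp(mu)               # the P50 of a lognormal is exp(mu)
mean = math.exp(mu + sigma**2 / 2)  # the mean sits above the median
print(f"median = {median:.1f}, mean = {mean:.1f}, sigma(log) = {sigma:.3f}")
```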

The next blog post will return to global scenarios in a qualitative discussion: what can we learn from our predecessors?
