The article states: “Wall Street has a notoriously bad forecasting record. It almost always predicts that the economy will grow by something like 3 percent a year, which happens to be correct most of the time… Amazingly enough, Wall Street’s consensus forecast has failed to predict a single recession in the past 30 years.”
Amazingly enough? Not really. One problem Wall Street (and perhaps most other forecasters) must deal with is that its audience expects, and often demands, a single point forecast; many investors and decision-makers seem to prefer the false security of a static number.
When it comes to defending Wall Street, don’t count me in. However, if one is required to present a single static forecast, then presenting what one considers the most likely outcome is the only logical (and apparently unbiased) response. With a recession occurring on average, say, once every ten years, such scenarios are simply excluded from a static forecast. Risk modelling, by contrast, would allow the possibility and consequences of a recession to be included, giving a lower, less optimistic estimate of base-case earnings: the average (mean) rather than the most likely value (mode).
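The gap between the most likely value and the average can be sketched with a tiny Monte Carlo simulation. All of the numbers below (3% growth in a normal year, a recession once a decade costing about 1% of contraction, the spread around each scenario) are illustrative assumptions, not figures from the article:

```python
import random

random.seed(0)

# Illustrative assumptions: a "normal" year grows around 3%;
# with probability 0.1 (roughly one recession per decade) the
# economy contracts by about 1%.
P_RECESSION = 0.1
NORMAL_GROWTH = 3.0      # most likely outcome, in percent
RECESSION_GROWTH = -1.0  # assumed recession outcome, in percent

def simulate_year():
    """One Monte Carlo draw of annual growth."""
    if random.random() < P_RECESSION:
        return random.gauss(RECESSION_GROWTH, 1.0)
    return random.gauss(NORMAL_GROWTH, 0.5)

draws = [simulate_year() for _ in range(100_000)]
mean_growth = sum(draws) / len(draws)

# The static "most likely" forecast is ~3%, but the
# probability-weighted average is pulled down by the recession
# scenario: roughly 0.9 * 3.0 + 0.1 * (-1.0) = 2.6.
print(f"most likely (mode): ~{NORMAL_GROWTH}%")
print(f"simulated mean:     {mean_growth:.2f}%")
```

A forecaster reporting only the mode would say 3%; a risk model that weights in the recession scenario gives a mean nearer 2.6%, which is exactly the "lower or less optimistic" base case described above.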
What is “consensus”? Since each analyst reports his or her most likely value, the range of estimates making up a consensus forecast is something close to the distribution of the mode of the outcome, rather than the distribution of the outcome itself.
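This distinction can also be illustrated numerically. In the sketch below (again with assumed, illustrative parameters), each analyst reports the mode of a broadly similar model, so their estimates cluster tightly around 3%, while the actual outcome distribution, which includes the recession scenario, is far wider:

```python
import random
import statistics

random.seed(1)

# Illustrative assumption: 50 analysts each report the most likely
# value of their own model -- "normal-year" growth near 3%, with
# small model-to-model differences.
analyst_estimates = [random.gauss(3.0, 0.3) for _ in range(50)]

# The actual outcome is a mixture that includes the recession
# scenario (assumed probability 0.1, roughly -1% growth).
def one_outcome():
    if random.random() < 0.1:
        return random.gauss(-1.0, 1.0)
    return random.gauss(3.0, 0.5)

outcomes = [one_outcome() for _ in range(100_000)]

spread_of_estimates = statistics.stdev(analyst_estimates)
spread_of_outcomes = statistics.stdev(outcomes)

print(f"spread of analysts' modal estimates: {spread_of_estimates:.2f} pts")
print(f"spread of actual outcomes:           {spread_of_outcomes:.2f} pts")
```

The narrow range of analyst estimates is the distribution of the mode; reading it as the distribution of the outcome badly understates the true uncertainty.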
Such biases in static forecasting are unavoidable, and will disappear only when the clients of such forecasters are willing to accept (or demand) a proper uncertainty analysis around forecasts, something that is happening more and more but still has a long way to go. Of course, @RISK (risk analysis using Monte Carlo simulation) and the DecisionTools Suite (a complete toolkit for decision making under uncertainty) provide an ideal way to implement risk analysis within the Excel environment that is typically used for forecasting.
Dr. Michael Rees
Director of Training and Consulting