The Problem with Experts' Predictions
As our hyper-connected world becomes increasingly complex and fast-paced, we tend to rely more and more on experts’ advice to get a sense of what tomorrow might bring. Does this matter? Enormously, because, most of the time, experts’ opinions have the perverse effect of confusing or misleading us instead of enlightening us!
Over the past few years, a plethora of research papers has shown that expertise is highly overrated. A recent article published by a group of US academics goes further, arguing that the stronger an expert’s opinion, the less he or she actually knows. In the same vein, Philip Tetlock (an American political scientist) convincingly demonstrated a few years ago, in a now-famous book (Expert Political Judgment), that an inverse correlation tends to exist between celebrity and accuracy: generally, the more famous experts (those we tend to listen to) get it wrong more often than the unknown ones.
The conclusion? We should be wary of experts’ predictions, even more so when they come from the most famous and authoritative ones. We all have in mind, for example, the extent to which almost every pundit got it wrong about the Eurozone: from 2009 to mid-2012, the overwhelming conviction was that the Eurozone would disintegrate, with at least one country exiting. This was of course reflected in the market consensus, and it proved wrong.
This phenomenon of “strong opinions and wrong predictions” tends to be reinforced and amplified by the media. That might have to do with the “selection bias” they almost systematically engage in by favouring celebrities with strong “black and white” points of view and extreme opinions. In the absence of a mechanism that pinpoints and weeds out bad pundits or bad opinions, being occasionally (or even consistently) wrong is not the problem it should be. One step forward might be to establish a global registry that monitors experts’ opinions, calls and predictions in a particular field, but such a registry does not exist yet.
What to do? In my latest book, Disequilibrium, I suggest a few simple “rules” for dealing with the problem of experts’ convictions and, beyond that, with greater uncertainty.
First of all, don’t try to predict what cannot be predicted! As much as we long for the intellectual comfort derived from linear projections, strong discontinuities are a fact of life. Even though entire segments of the financial, media and publishing industries thrive on providing forecasts to an apparently sophisticated group of professionals (traders, bankers, economists, and so on), we all know that there are things that cannot be predicted, let alone forecast (the short-term price of commodities or currencies, the future, etc.). If we can’t predict the future, we can at least anticipate it! It’s important to remember that big surprises are almost always preceded by a trail of early-warning signals, however faint they may be.
Differentiate what you don’t know from what you can’t know. To avoid fooling ourselves, it is important to understand and dissociate what we don’t know (but could know) from what we cannot know (even if we wanted to). To help with this, we can think about risks as a continuum and use a simple typology called the KuU framework: moving from known risks (K), through unknown (u), to unknowable (U). Known risks can be measured and managed (because they can be priced); their causes, probability of occurrence and likely impacts are understood and well defined (the most obvious example being natural disasters). Unknown risk events are well defined, but it is not possible to assign probabilities to specific events occurring (terrorism and systemic financial risks are good examples of unknown risks). Therefore, they cannot be priced. Unknowable risks have not yet emerged. “Unknowability” is a key consideration in today’s interdependent world, where a large number of possible combinations of risks and vulnerabilities can lead to a vast array of possible outcomes, some of which are “perfect storms”.
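The KuU continuum can be reduced to a simple decision rule: can the risk be defined, and can a probability be attached to it? The sketch below is an illustrative reading of the framework, not a formalism from the book; the function and class names are my own.

```python
from enum import Enum

class RiskClass(Enum):
    KNOWN = "K"        # well defined AND priceable: probabilities can be assigned
    UNKNOWN = "u"      # well defined, but no reliable probabilities, so unpriceable
    UNKNOWABLE = "U"   # the risk has not yet emerged; it cannot even be defined

def classify(is_defined: bool, has_probabilities: bool) -> RiskClass:
    """Place a risk on the KuU continuum (illustrative rule, not the book's)."""
    if not is_defined:
        return RiskClass.UNKNOWABLE
    if has_probabilities:
        return RiskClass.KNOWN
    return RiskClass.UNKNOWN

# Examples drawn from the text:
print(classify(True, True))    # natural disasters -> RiskClass.KNOWN
print(classify(True, False))   # terrorism, systemic financial risk -> RiskClass.UNKNOWN
print(classify(False, False))  # not-yet-emerged "perfect storms" -> RiskClass.UNKNOWABLE
```

The point of the rule is the asymmetry: only the K category supports pricing, and no amount of modelling moves a u or U risk into K.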
Don’t try to compute the average. We all fall prey to the tendency to rely on a single number that underpins an expert’s conviction. Even worse, in situations of fundamental uncertainty, experts tend to assign subjective probabilities to the likelihood that a particular event may occur (the functional equivalent of tossing a coin). For reasons related to power-law versus Gaussian distributions, these are more often than not wrong where complex, adaptive, non-linear systems are concerned. Computing the average gives a false sense of security and makes us feel good, but it is utterly misleading: averages lead us to underestimate risk in the face of uncertainty. Similarly, don’t pay too much attention to mean reversion, which only “works” when conditions remain constant.
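The power-law point above can be made concrete with a short simulation: under a Gaussian, sample averages settle down quickly, but under a heavy-tailed power law (here a Pareto with tail index 1.1, so the variance is infinite) the average of even ten thousand observations swings wildly from run to run. This is a minimal sketch of the statistical phenomenon, not code from the book.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def sample_mean_gaussian(n):
    # Average of n draws from a standard normal (thin tails).
    return sum(random.gauss(0.0, 1.0) for _ in range(n)) / n

def sample_mean_pareto(n, alpha=1.1):
    # Average of n draws from a Pareto(alpha) via inverse-CDF sampling:
    # X = U**(-1/alpha) for U ~ Uniform(0, 1). For alpha just above 1,
    # the mean exists but the variance is infinite.
    return sum(random.random() ** (-1.0 / alpha) for _ in range(n)) / n

def spread(values):
    # How much the sample mean varies across independent trials.
    return max(values) - min(values)

gauss_means = [sample_mean_gaussian(10_000) for _ in range(20)]
pareto_means = [sample_mean_pareto(10_000) for _ in range(20)]

print("Gaussian sample-mean spread across 20 trials:", spread(gauss_means))
print("Pareto   sample-mean spread across 20 trials:", spread(pareto_means))
```

Run this and the Gaussian averages cluster tightly around zero, while the Pareto averages scatter over a range orders of magnitude wider: the “average” of a heavy-tailed process is exactly the falsely reassuring number the text warns against.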
Don’t try to predict the catalyst; try instead to identify areas of vulnerability. This is an essential piece of advice that Nassim Taleb gives on how to build more robustness into any given system. Predicting the catalyst is foolish because one simply cannot know which one it will be. The financial markets, in particular, are obsessed with the search for “The Catalyst”, always looking for the Holy Grail that might trigger a reversal or a major trend in the markets (a surge in a particular bond yield, a flight to safety to one distinct market, an increase or decrease in a commodity price, and so on). This is in vain: in complex, non-linear, adaptive systems such as the global economy or financial markets, an event always has more than one cause, all of which are intertwined in a web of complex interrelationships. In such conditions, it makes much more sense “to study the terrain” and identify the most vulnerable and fragile areas than to focus on the illusory catalyst.
Listen to contrarian opinions! This is what some of the world’s most successful fund managers do. At this year’s Berkshire Hathaway annual meeting, for example, Warren Buffett invited Doug Kass, who had taken a short position against Berkshire, to take part, urging everybody to “listen to those who insist you’re wrong”!