A new book is out by the journalist Tom Chivers, author of The Rationalist’s Guide to the Galaxy and How to Read Numbers. Kirkus Reviews calls it “An ingenious introduction to the mathematics of rational thinking.” The Wall Street Journal likes it. Oliver Burkeman wrote, “Life is shot through with uncertainty, but in this fascinating, witty and perspective-shifting book, Tom Chivers shows why this needn’t condemn us to powerlessness and panic.”
The book is entitled Everything Is Predictable. Spoiler alert: it’s not.
Oh, everything is predictable in the sense that, say, you can shoot at everything in the universe with a Nerf gun. That, of course, does not mean that you can HIT, say, the planet Saturn with a sponge dart shot from, say, Akron. Similarly, you can “predict” anything you want, but there is a huge sphere of uncertainty in which prediction can never actually give you useful guidance.
We scenario-planning types are not engaged in prediction at all. We are engaged in identifying and describing multiple plausible alternative future outcomes, and helping our clients think deeply about them. We do this for several reasons.
First, we are interested in strategic decisions. If you are not going to make different choices based on a prediction, then it is not strategic. In fact, things that are actually predictable – astronomical events, physical, chemical, and some biological processes, increasingly the weather, actuarial outcomes for large populations, and to some degree demographics – tend not to be strategic. Predictability is a quality of phenomena that are well-understood by almost everyone, and hence not a source of competitive advantage to anyone.
Chivers says the following near the beginning of this book:
“Life isn’t chess, a game of perfect information, one that in theory can be ‘solved.’ It’s poker, a game where you’re trying to make the best decisions using the limited information you have. … Any decision-making process, anything that, however imperfectly, tries to manipulate the world in order to achieve some goal, [e.g.] governments trying to achieve economic growth: if it’s doing a good job, it’s being Bayesian.”
But life is NOT poker. Unless it’s a form of poker in which when you lay down three cards, the dealer hands you a lobster, a spatula, and a pine-tree-shaped car air freshener, and the card table gets turned over or flooded on a regular basis. Knowing the odds of a poker game is a lot easier and less useful than merely anticipating that things like that could happen, and bringing a raincoat and a lobster pot.
When you try to predict things using Bayesian approaches, you are starting with “priors,” that is, your preliminary estimate of the probability of a certain outcome. And that requires some estimate based on the past – crudely, “data.” And as new data – also from the (more recent) past – comes in, you alter your “priors.”
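To make the mechanics concrete, here is a minimal sketch of a single Bayesian update, in Python. The numbers are invented for illustration and are not from the book; the point is only the shape of the calculation Chivers describes, in which a prior is revised by how likely the new evidence is under each hypothesis.

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior P(H | evidence) via Bayes' theorem."""
    numerator = prior * p_evidence_given_h
    denominator = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / denominator

# Hypothetical numbers: a 30% prior, and evidence that is twice as
# likely if the hypothesis is true (0.8) as if it is false (0.4).
posterior = bayes_update(0.30, 0.8, 0.4)
print(round(posterior, 3))  # 0.462
```

Note that the update can only ever respond to evidence after it arrives; nothing in the formula anticipates evidence of a kind that has never been seen before.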
And yet all data is about the past – NOT THE FUTURE. Any predictive system assuming that the past is a reliable guide to the future will work just fine – until the lobsters and floods happen, and it suddenly does not work anymore. Human beings, however, have imagination, which can conjure up entirely new circumstances and events that have never happened before.
Rigorous imagination – the approach we use in writing scenarios of the future for clients – is not imprisoned within a jail of past data. Nor is it concerned with probabilities at all – only with impacts on the client’s business.
Everything Is Predictable is an engaging and interesting history and exposition of how Bayesianism is critical in certain scientific, medical, and other specific applications. Bayes helps to show us non-intuitive results in areas such as vaccine efficacy, cancer screening, and the like.
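The cancer-screening case is the standard illustration of why those results are non-intuitive. Here is a hedged sketch with invented figures (1% prevalence, 90% sensitivity, 9% false-positive rate – not numbers taken from Chivers’s book) showing that even with a fairly accurate test, a positive result is still far more likely to be false than true:

```python
# Hypothetical screening numbers, chosen only for illustration.
prevalence = 0.01           # P(disease) in the screened population
sensitivity = 0.90          # P(positive | disease)
false_positive_rate = 0.09  # P(positive | no disease)

# Bayes' theorem: P(disease | positive test)
p_positive = (prevalence * sensitivity
              + (1 - prevalence) * false_positive_rate)
p_disease_given_positive = prevalence * sensitivity / p_positive

print(f"{p_disease_given_positive:.1%}")  # about 9.2%
```

Because the disease is rare, the many false positives from the healthy majority swamp the true positives – exactly the kind of base-rate result where Bayes genuinely earns its keep.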
But it’s pointless even for issues such as one Chivers himself brings up at the beginning of the book: the probability of Russia invading Ukraine. He thinks the problem with it is “[W]hat’s your base rate? The average number of land wars in Europe per year? The average number of Russian invasions of Ukraine per year? It’s a subtle art—picking an appropriate reference class to compare your example with.”
Preparing for Multiple Outcomes
But that’s not the real problem. The real problem is, why predict this numerically at all? There are (were) two basic outcomes, and we can’t get anywhere near 0% or 100% probability, so we’re going to have to prepare for either outcome, no matter what.
So for strategic decision-making, this Bayesian approach is almost completely useless. And in that respect, Everything Is Predictable is a worthy addition to the Canon of the Cult of Prediction, which is extensively documented in my forthcoming book, Fatal Certainty: How a Cult of Prediction Made the 21st Century an Era of Strategic Shock, and How Rigorous Imagination Could Bring Us Back.
Question: Does Tom Chivers deal at all with strategic uncertainty and the problems you bring up in your thoughtful (and amusing) essay?
Another key point is that Chivers says flat-out, not just in the title, that “everything is predictable.” He says early on, “When we make decisions about things that are uncertain – which we do all the time – the extent to which we are doing that well is described by Bayes’ theorem. Any decision-making process, anything that, however imperfectly, tries to manipulate the world in order to achieve some goal, whether that’s a bacterium seeking higher glucose concentrations, genes trying to pass copies of themselves through generations, or governments trying to achieve economic growth: if it’s doing a good job, it’s being Bayesian.”
It is true that adjusting one’s outlook as the result of new facts is rational, if you are still in the game after they are manifest. But if we are facing a decision that is momentous, by the time the facts change, and we can adjust our outlook, it is often far too late. Bayesianism ONLY takes into account what has already happened – i.e., the past. It cannot anticipate a fundamental change in the game. Those sorts of strategic changes can only be IMAGINED ahead of time, never predicted.
He skirts them. For example, he brings up the “Will Russia invade Ukraine?” question, which is clearly now moot (and was long before the publication of Everything Is Predictable). He merely points out what types of “priors” a forecaster would have needed to predict that event. He has a “Superforecaster,” Jon Kitson, say that you could start with the frequency of land wars in Europe since 1945, which would be about once in twenty years. Then you alter your “priors” with new information. Kitson says, “By December 2021 I was at about 60 percent likely that there’d be a war, and was up to 80 percent by mid-January,” that is, just before the war. Presumably he’d expended great effort altering these estimates day by day. And someone, maybe in the U.S. government, was even paying him to do this. Any such money would, in my opinion, be wasted. Russia had already invaded Ukraine in 2014. It invaded Georgia in 2008. So we already knew that a Russian invasion of Ukraine was quite plausible. Altering the percentage estimates of its likelihood daily might have been helpful in convincing Zelensky to take the possibility seriously – except that this exact approach did not even work to convince Zelensky that invasion was imminent.
But the more basic point is, when you have a situation in which there are two basic outcomes, and either outcome has a non-trivial chance of occurring, further refinement of percentages is not a good investment. What is a good investment? IMAGINING how each of the two basic outcomes (and a lot of other ones) might evolve, and developing STRATEGIES to deal with them – BEFORE they occur.
There really are not any other examples of what I would call strategic decisions in this book. There’s a lot about how the brain works; there’s a lot about certain types of common mistakes humans make in very well-defined choice situations that can be objectively measured afterward. But the sort of complex, multivariate, hard-to-measure judgment calls that top managers have to make all the time? No. Even in the case of Ukraine, was Bayesian forecasting helping the U.S. decide how to act with respect to the crisis? Whether to arm the Ukrainians? How far to go in provoking Russia? Sanctions? Alliances? NATO? For each of these issues, you need imagination, not prediction. Because each of them can have a wide variety of plausible positive and negative outcomes, and we would have to be prepared for each of them.
Don’t get me wrong, this is an engaging book about a rather interesting and arcane method of forecasting that has a vital role in understanding certain non-intuitive results of, say, medical testing (especially the dangers of false positives). It’s great in examining individuals’ choices in game-type situations. But for strategic decision-making about complicated, one-off, bet-the-company-or-country choices under conditions of radical uncertainty, Bayesian forecasting is amusing but useless at best, and positively misleading at worst.
But there is a 16.27% chance I’m wrong about that.
Perhaps it was predictable that just as you finish writing a book warning of ‘the cult of prediction’, up pops one called ‘Everything Is Predictable’. I asked AI to predict my next word – it suggested ‘The’ and continued ‘The irony is palpable’. As far as predictions go, that’s not bad.
“You don’t know what ‘irony’ is, do you, Baldrick?”
“Sure I do. It’s like goldy or bronze-y, only it’s made of iron.”