Historians will debate whether the American experience in Afghanistan had to end the way it did. But the suicide attack in Kabul on August 26 was more predictable. The Biden administration had warned just days earlier that the US-led effort to get as many Americans and allied Afghans as possible out of the country might be the target of terror attacks. Unfortunately, the warning was prescient. (At least one former client-colleague was among those trying to get people to safety – our thoughts are with all of those at risk.)
But almost the entire twenty years of US involvement in Afghanistan that led up to this fraught exit can be seen as a failure to see the future coming – a failure of anticipation different in kind from the ones we usually encounter.
In a more classic type of error, which could be called the “Humean fallacy,” leaders and experts assume that the future will be like the past. COVID has been an example: “expert opinion,” based on previous pandemics, has been confounded over and over during the past year and a half:
- In early 2020, experts (based on past pandemic experience) said schools would be “super-spreaders.” So children were kept home, and “virtual learning” disrupted the lives of tens of millions.
- Later, when viral spread from children failed to materialize, experts decided that keeping kids home had many deleterious effects and that they should go back to school.
- But now the Delta variant has arisen, children seem more susceptible to it, and suddenly schools are feared to be just the sort of super-spreader sites they had been thought to be in early 2020.
Parents and school administrators are now embroiled in arguments over masks, attendance, and everything else; “expert opinion,” having appeared to mislead multiple times, inspires little more than scorn among a large portion of the population.
Vacuous numbers, elusive strategy, and failures of imagination
But at least the COVID authorities were trying to use experience to guide their efforts. In Afghanistan, a grosser sort of error predominated: pure faith in numbers.
Spreadsheets and PowerPoint briefings were used to convince policymakers that progress was being made, as though the “Five O’Clock Follies” of Vietnam had never occurred. Reports of the numbers of army and police personnel trained were taken to mean Afghanistan was becoming more stable and self-sufficient. Mere spending was made a key objective. In the words of the Special Inspector General for Afghanistan Reconstruction, “Money spent, not impact achieved, became the primary metric of success.” Eventually, spending was so astronomical ($978 billion, by one count) that those making policy could not conceive that it had been almost entirely wasted. Even “worst-case scenarios” failed to anticipate reality. How could 300,699 Afghan soldiers and police fail to keep the Taliban at bay for six months, or at least 90 days?
Those numbers, ginned up from billions spent, or out of next to nothing at all, acquired a talismanic power. The 300,000 figure was repeated by President Biden in July as assurance that the Taliban could not take over anytime soon. But it turned out that far more than half of the Afghan Army were “ghost soldiers,” drawing paychecks that went into the bank accounts of corrupt officers and officials. And many of their leaders had already been quietly negotiating surrender terms with the Taliban in anticipation of the announced American departure.
What was lacking in the American handling of Afghanistan was not numerical analysis. It was strategy – “The U.S. government continuously struggled to develop and implement a coherent strategy for what it hoped to achieve” – and imagination. No one in a position of power stopped to imagine a variety of qualitatively different – and plausible – endpoints for the US involvement there, and to work back from them to the present to imagine how they might come about. One, obviously, would be the collapse of the Afghan government and a swift Taliban takeover. Another might be the collapse of the country into fiefdoms run by rival warlords. A third could have been a redoubled US effort in the country due to some sort of conflict with a neighboring state (Iran? China? Pakistan?). Strategies could have been prepared for each; none were.
The problem with these putative futures, for numbers-obsessed planners, was that they seemingly could not be analyzed, because they had not happened yet. Analysis requires data, and all data is about the past. But the future doesn’t care about numbers. FSG scenario planners believe that fundamental future change can only be anticipated using imagination. Analysis without imagination is a guarantee of strategic surprise.
One lesson from Afghanistan and COVID: Better imagination without analysis than the other way around.
There is a difference between someone being responsible and someone hiring a team of experts. Johnson’s War on Poverty was in fact the Social Scientist Full Employment Act, and changes in the tax laws always stimulate the economy by increasing consumer and business spending on lawyers and accountants. The classic analyst’s solution of breaking every big problem into an endless number of small, manageable tasks is today’s equivalent of Zeno’s paradox: the US Department of Defense’s Achilles was never able to catch up with the Taliban tortoise, any more than the CDC has been able to overcome Covid-19.
Good point. Breaking a large problem into smaller problems to be solved, and proceeding to solve those without constantly making sure you’ve defined the “parent problem” correctly, is, of course, the antithesis of strategic thinking. A wide array of plausible alternative future scenarios, each addressing issues that have a great impact on one’s world of work, can help with problem definition, and lessen the chance that you’ll immediately sprint out to the end of one flimsy tree limb when it’s the whole forest you need to be worried about.

One of our military clients once told us: “Our officers would rather do than think.” Personally, I think everyone would rather do than think. Thinking is much harder; doing (at least, accomplishing a task that has been pre-defined for you as part of a giant DOD/DOS plan for reconstruction) allows you to turn off the disorderly right part of your brain and get down to measurable chunks of work.

Well-defined sub-tasks are great, but you have to do the problem definition correctly up front. That means using your right-brain imagination a little longer than you want to, or than is “appropriate” for an adult “doer,” and building in the sort of flexible contingency planning that acknowledges the irreducible uncertainty in any human endeavor (I believe one of my colleagues will be addressing this issue in regard to Afghanistan shortly). In sum, you can’t start your rigorous left-brain analysis until you have something to analyze… and that requires a right brain that is capable of a different sort of rigor.