BUSINESSES ARE often tempted to use pilot projects to deliver a satisfying success by introducing a new product in an optimal scenario with hand-picked staff and savvy customers. However, the goal of pilot projects is not to attain a pleasing triumph but to reveal new knowledge. Pilots may be more instructive when the conditions are deliberately challenging and businesses can discover what does not work.
War games have a similar goal to pilot projects: discovering novel information before launching a larger endeavor. The aim is not success but original insight. Often, new knowledge is more likely to arise from exploring negative scenarios of stalemate and loss.
ONE SOLUTION to overconfidence and the planning fallacy is task segmentation, or breaking down a project into its constituent parts. Researchers have found that when people are asked to estimate the time required to complete an overall task, they tend to be too optimistic. But when they are asked to estimate the time required to complete each individual subtask, and then sum the parts, the estimates are more realistic. People are less likely to miscalculate the time needed for short tasks, and working through a checklist of jobs means that no assignment is ignored in the overall appraisal.
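The mechanics of task segmentation can be sketched in a few lines of code. The subtasks and week counts below are hypothetical illustrations, not figures from the text; the point is simply that estimating each piece and summing tends to beat a single gut-level guess at the whole.

```python
# Hypothetical subtask estimates (in weeks) for a notional
# post-conflict stabilization effort. The names and numbers are
# illustrative only.
subtask_estimates_weeks = {
    "secure key infrastructure": 8,
    "restore basic services": 12,
    "train local security forces": 26,
    "hold elections": 20,
}

# An optimistic holistic guess at the whole project, for contrast.
holistic_guess_weeks = 16

# Task segmentation: sum the per-subtask estimates.
segmented_estimate_weeks = sum(subtask_estimates_weeks.values())

print(segmented_estimate_weeks)  # 66
print(segmented_estimate_weeks > holistic_guess_weeks)  # True
```

The checklist structure also guards against omission: any subtask left out of the dictionary is visibly absent, whereas a holistic estimate can silently ignore whole phases of the work.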
Military planners can employ task segmentation to create a more realistic estimate of the operational timeline. For example, planners originally intended to reduce U.S. troop levels in Iraq to thirty thousand by September 2003, just four months after major combat operations ended. If planners had unpacked the process of post-conflict stabilization into separate subtasks and made plausible estimates of the time needed for the completion of each element, the overall estimate might have been more accurate.
Another tool to ward off overconfidence in the business realm is a pre-mortem, in which actors imagine that a future project has failed and then analyze the most likely causes. As Thomas Gilovich wrote, “The idea is to suppose that your idea bombed. What would you be saying to yourself right now about how or why you should have foreseen it?”
This approach helps officials to clarify exactly what success and failure look like, and how to translate mission goals into measurable metrics. It readies planners to anticipate and detect failure early on. It also empowers skeptics to raise doubts and forces even zealous champions of a policy to think through potential hazards. Simply asking planners to review a strategy for flaws may lead to a half-hearted effort, because people adopt a tunnel-vision mindset and act like partisans of the chosen course of action. By contrast, planners engaged in a pre-mortem step outside the project; they are less emotionally attached to its success and more willing to search actively for problems.
The pre-mortem also encourages officials to plan for potential failure and create a plan B if the campaign goes off the rails. Thinking through contingencies and exit strategies in case of disaster is not defeatist, any more than building a fire escape is defeatist. And if officials have a strategy to mitigate loss in their back pocket, they are better positioned to take risks.
TO COUNTERACT positive illusions, businesses may employ a devil’s advocate, someone tasked with critiquing the prevailing plan of action and finding disconfirming information before the enterprise is committed to an unsound endeavor. Another approach is a murder board, in which an entire committee seeks to identify flaws in a plan and “kill” it. The murder board idea originated with the Central Intelligence Agency after World War II, and the Joint Chiefs have also employed this tool to evaluate projects. But murder boards are used inconsistently. For example, the assumptions behind the Iraq invasion plan were not subject to appropriate critical vetting.
These structures will only work in a hierarchical organization like the U.S. military if the devil’s advocate or the murder board is empowered to critique senior officials, who, in turn, are willing to listen. One solution is to bring in outsiders. Researchers have found that the best way to check hubristic CEOs is an independent and active board of directors. In the military, the same role can be played by retired generals or other respected external figures.
Furthermore, since people tend to be realistic when assessing other actors’ projects and overconfident about their own, it may be helpful to ask allies to stress test a war plan. Before the invasion of Iraq, for example, Washington could have invited British officials to systematically interrogate the strengths and weaknesses of the U.S. strategy.
OFFICIALS IN the national security community often think about failure in biased and dangerous ways. Leaders fixate on past loss when learning and neglect future loss when planning. Of course, there is variation among different actors; for example, civilian leaders in the White House may be more sensitive to domestic political pressures. But the broad pattern of biases holds true across institutions.
The military needs a paradigm shift. One answer is to study the school of intelligent failure in business, which encourages people to view loss as the midwife of progressive change. A culture of intelligent failure can be embedded in all parts of the U.S. military, including the hiring and promotion process (where candidates are evaluated on their capacity to learn from past negative experiences); training programs (which distinguish between acceptable and unacceptable errors); and leadership roles (where senior figures describe their own experiences with loss). Increasing the chance of success means engaging honestly and openly with the possibility of failure.
Dominic Tierney is associate professor of political science at Swarthmore College, a Templeton Fellow at the Foreign Policy Research Institute, a contributing editor at The Atlantic and the author of four books, most recently, The Right Way to Lose a War: America in an Age of Unwinnable Conflicts (Little, Brown, 2015).