
The Hard Truth About Thinking Like the Enemy

We’ve all encountered the cynical devil’s advocate, making an argument just for argument’s sake. Varying between frustrating and entertaining, the group curmudgeon is rarely a critical part of an organization’s decision-making process. Truly divergent thinking that provides alternative analysis is slightly more nuanced, and it becomes a deliberate piece of the process for an organization facing uncertain challenges.

In Red Team: How to Succeed by Thinking Like the Enemy, Micah Zenko, a senior fellow at the Council on Foreign Relations, presents a case for deliberately including such an advocate to challenge the major decisions behind an organization’s operations, plans, strategies, or procedures. His book is likely to become the entry point for analysts seeking resources to help unlock the enemy’s mind.

The red teaming concept dates back to the sixteenth-century advocatus diaboli, whose role was to make the case against arguments for canonization. The term “red team” did not emerge until the Cold War, when teams played the part of the “red” enemy against “blue” friendly forces in order to test, or “war game” as it is more colloquially known, strategies and plans. Developed most extensively within the military, but practiced throughout the national security community and the private sector, red teaming has grown over the years.

Zenko uses a variety of cases, from success to gross misuse, to highlight the importance of red teaming. He contrasts the bombing of the Al-Shifa pharmaceutical factory, where no red team was used for competitive analysis, with the Abbottabad mission to kill Osama bin Laden, which was “red-teamed to death.” In cases such as Millennium Challenge 2002, Red Team shows how the value of red teams in wargaming can be undermined when the scope is manipulated to reach a desired outcome. Zenko amassed an impressive number of interviews and cases across the military, homeland security, and the private sector, though the cases chosen for discussion are less an academic test of the value of red teaming than a persuasive argument for best practices in red teaming. Zenko argues that, used correctly, red teams can provide the critical thinking necessary to identify acute vulnerabilities and build more adaptive organizations.


“You Can’t Grade Your Own Homework”

The plan makes sense to you; you wrote it. Your team spent days, weeks, or even months working on it, and you are certain that every contingency has been considered. Action, reaction, and counteraction are all clear. However, as Neil deGrasse Tyson has observed, “In science, when human behavior enters the equation, things go nonlinear. That's why Physics is easy and Sociology is hard.” The number of variables in a series of human interactions is a challenge for even the most astute and deliberate teams of planners and strategists. The terrain of the marketplace of ideas can and will change over time; a competitor develops its own plan and will react to your actions in ways you may not foresee. Grading your science homework is one thing: scientifically, can the causal finding be replicated? It is another thing entirely to quantify confidence in your own plan to introduce a new product into the marketplace, engage in offensive operations against an enemy, or defend your critical vulnerabilities against attack.

Despite the best efforts of planning or strategy teams, individual or collective cognitive biases such as mirror imaging, anchoring, and confirmation bias can lead to errors that are invisible to the authors of the analysis. We did not grade our own composition papers in grade school and, prudently, we should not allow ourselves to be the sole judge of our strategies, operations, or procedures.

Strategies developed within the vacuum of an organization, without the benefit of alternative analysis, are likely to reach a suboptimal outcome. Incomplete information, compounded by natural human and organizational biases, can cloud analysis of a plan’s weak points and may leave vulnerabilities that are obvious to an objective outsider or a motivated competitor.


Best Practices
