The Hard Truth About Thinking Like the Enemy

March 23, 2016 | Topic: Security | Blog Brand: The Buzz | Tags: U.S. Military, Strategy, Business, Success, Red Team


Micah Zenko’s new book makes it easy to buy into the concept of red teaming.

We’ve all encountered the cynical devil’s advocate, usually making an argument just for argument’s sake. Ranging from frustrating to entertaining, the group curmudgeon is rarely a critical part of an organization’s decision-making process. Truly divergent thinking that provides alternative analysis is more nuanced, and it becomes a deliberate part of the process for an organization facing uncertain challenges.

In Red Team: How to Succeed by Thinking Like the Enemy, Micah Zenko, a senior fellow at the Council on Foreign Relations, presents a case for deliberately including such an advocate to challenge an organization’s major decisions about operations, plans, strategies, and procedures. His book is likely to become the entry point where analysts seek resources to help unlock the enemy’s mind.

The red-teaming concept dates back to the sixteenth-century advocatus diaboli, whose role was to make the case against arguments for canonization. The term “red team” did not emerge until the Cold War, when teams played the part of the “red” enemy against “blue” friendly forces in order to test out, or “war game,” strategies and plans. Developed most extensively within the military, but practiced throughout the national security community and the private sector, red teaming has grown over the years.

Zenko uses a variety of cases, from success to gross misuse, to highlight the importance of red teaming. He contrasts the bombing of the Al-Shifa pharmaceutical factory (where no red team was used for competitive analysis) with the Abbottabad mission to kill Osama bin Laden (which was “red-teamed to death”). In cases such as Millennium Challenge 2002, Red Team shows how the value of red teams in wargaming can be undermined when the scope is manipulated to reach a desired outcome. Zenko amassed an impressive number of interviews and cases across the military, homeland security, and the private sector, though the cases chosen for discussion are less an academic test of the value of red teaming than a persuasive advocacy for its best practices. Used correctly, Zenko argues, red teams might provide the critical thinking necessary to identify acute vulnerabilities and build more adaptive organizations.


“You Can’t Grade Your Own Homework”

The plan makes sense to you; you wrote it. Your team spent days, weeks, or even months working on it, and you are certain that every contingency has been considered. Action, reaction, and counteraction are all clear. But as Neil deGrasse Tyson has quipped, “In science, when human behavior enters the equation, things go nonlinear. That’s why Physics is easy and Sociology is hard.” The number of variables in a series of human interactions is a challenge for even the most astute and deliberate teams of planners and strategists. The terrain of the marketplace of ideas can and will change over time; a competitor develops its own plan and will react to your actions in ways you may not foresee. Grading your science homework is one thing: scientifically, can the causal finding be replicated? It is another thing entirely to measure confidence in your own plan, whether it introduces a new product into the marketplace of ideas, launches offensive operations against an enemy, or defends your critical vulnerabilities against an attack.

Despite the best efforts of planning or strategy teams, individual and collective cognitive biases such as mirror imaging, anchoring, and confirmation bias can lead to errors that are invisible to the authors of the analysis. We did not grade our own composition papers in grade school and, prudently, we should not allow ourselves to be the only judges of our strategies, operations, or procedures.

Strategies developed in the vacuum of an organization, without the benefit of alternative analysis, are likely to reach suboptimal outcomes. Incomplete information, compounded by natural human and organizational biases, can cloud the analysis of a plan’s weak points and leave vulnerabilities that are obvious to an objective outsider or a motivated competitor.


Best Practices

As with any endeavor conducted regularly across a broad spectrum of time and space, one of the challenges for practitioners is to develop both best practices that are effective across iterations and principles that endure over time. While Zenko notes that actual practitioners of red teaming bristle at the concept of a “best practice,” he attempts to distill from their experiences a set of practices that apply across the variety of red teams he studied. Much like the “paradoxes of counterinsurgency,” Zenko’s list of red-team best practices demands a great deal of balance and judgment. To be effective, he argues, red teamers must be “outside and objective, but inside and aware,” and an organization must “red team just enough, but not more.” By examining these best practices across a variety of cases, Zenko attempts to develop enduring principles that can apply more broadly.

For each principle, Zenko does an excellent job of identifying entertaining and informative anecdotes from a range of military and corporate organizations that have succeeded or failed according to it. It is these cases, and the principles Zenko distills from them, that make Red Team a worthy read for any leader considering the concept of alternative analysis.

Reading Red Team will not teach you the skills required to be a red teamer (that would require another book, or an entire course of study), but you will learn some of what not to do, and why. For those who consider themselves smart enough to be the next red-team practitioner, Zenko warns that it takes a certain personality, and you may not be “oddball,” divergent, or fearless enough. Although the military seeks to train red teamers in mid-career, when habits of thinking may already be ingrained, the training could perhaps be more effective if developed in stages earlier in a leader’s career.


The Boss Must Buy In

If nothing else, Red Team makes it easy to buy into the concept of red teaming. Having never received formal training in red teaming, but being predisposed toward the concept of “thinking like the enemy,” I found the book to be an excellent introduction to the topic, one that will become the starting point for those interested in learning about the concept. Allocating resources to red teaming can be costly, gaining access to necessary information can be frustrating, and the boss’s support is critical to the success of any alternative-analysis team. If the boss does not support the red team and is not open to the criticism its analysis uncovers, the endeavor can be fruitless. Zenko provides a handy roadmap of others’ mistakes as a guide for future organizations. In an arena where failure results in the loss of life or treasure, Red Team will be a critical resource for leaders who want to give their organization the best chance at success.

Christopher G. Ingram is a U.S. Army officer. The views expressed in this article are those of the author and do not reflect the official policy or position of the U.S. Army, Army National Guard, Department of Defense, or U.S. Government. This article first appeared in The Bridge.

Image: Flickr/U.S. Army.