On War and Choice

Mini Teaser: It has long been said that there are wars of necessity and wars of choice. But enemies always adapt, especially in our world of terrorists, failing states and delinquent regimes. Every war is a war of choice.

by Author(s): Lawrence Freedman

 WARS ARE now commonly divided into types: those of necessity and those of choice. The former are unavoidable, fought because of a threat to our basic way of life. The latter are discretionary. There is no strategic imperative. These are the wars of regime change and humanitarian intervention. The distinction implies an underlying shift in international affairs, away from basic threats to our security and toward more complex challenges. If our rivals are less likely to pick a fight with us, does that give us more latitude to pick fights with others?

And yet this distinction is now well established. It peppers newspaper articles and Internet debates. I once thought I was responsible for this turn of events, but I am now aware of a number of paternity claims. Perhaps inevitably, New York Times columnist Thomas Friedman has one. Richard Haass, president of the Council on Foreign Relations, used the distinction in the title of his memoir of the two wars against Iraq. The main responsibility probably lies with the late Israeli Prime Minister Menachem Begin. In an August 1982 speech, he described that year's Israeli invasion of Lebanon, launched to push Palestine Liberation Organization forces out of the country and eventually surrounding Beirut, as a war of choice, and apparently took pride in the designation. He contrasted this with Israel's wars of "no alternative," such as the war of independence and the war of October 1973, when the country was attacked by Egypt and Syria. By contrast, in 1956, when Israeli troops entered the Sinai Peninsula; in 1967, when Israel destroyed the Egyptian Air Force at the opening of the Six-Day War; and now in 1982, Israel had taken the initiative. The purpose was preemptive, and so ultimately defensive, but Begin acknowledged that neither in 1956 nor in 1982 was the ultimate security of the state at risk.

Begin, in turn, appears to have gotten the idea from the twelfth-century Jewish philosopher Maimonides, who struggled with these same concepts. Maimonides distinguished between an "obligatory" war, essential to a state's survival, and a "discretionary" or "voluntary" war undertaken to extend its borders for the purposes of "greatness and reputation." This suggests a simple distinction between defensive and offensive conflicts. Subsequent rabbinical debates looked deeper into the questions of when and under what circumstances a state should bring force to bear, addressing the difficult and still-perplexing questions of both preemption and humanitarian intervention. When a failure to strike first would put the state at great risk, the rabbis were clear: there was no requirement to wait for final proof of an enemy's plan to attack. Better to go on the offense than to face a difficult defense.

There was more ambiguity when it came to whether there was an obligation to rescue a victim under attack. We would now, in the form of humanitarian intervention, put helping others clearly under the "choice" heading. This contradicts the rabbinical implication that wars of choice are inherently aggressive, though of course that is still commonly the view of those on the receiving end, who will claim that humanitarian justifications for military action are no more than pretexts. The rabbis were certainly sure that if embarking on a war of choice, the sovereign should be convinced of his case and consult carefully before acting. Such a move should only be undertaken in exceptional circumstances and with a compelling rationale.

War, therefore, should never be chosen casually. Precisely because there is a choice, costs and gains must be weighed especially carefully.


I STARTED to use this war-of-choice/war-of-necessity dichotomy in the mid-1990s. With Western military superiority so well established, it seemed less likely that other actual or aspiring major powers would pick fights with the United States and its allies, but there were evidently regular instances where force was considered for largely humanitarian purposes. I intended to give the idea a push in a monograph entitled The Revolution in Strategic Affairs, published by the International Institute for Strategic Studies (IISS) in 1998.

My immediate target was the idea that a revolution in military affairs was now fully under way, led by the United States. War planners argued that future operations would move forward well beyond what was already possible in the 1991 Iraq War, when America trounced Saddam Hussein's army, to a completely new level through the integration of precision-guided weapons with new information and communication technologies. Exploiting them would make it possible to employ forces with extraordinary efficiency and accuracy. Armies would move with speed and agility, calling in firepower whenever needed rather than carrying it with them. America, on this view, was undefeatable.

There was no denying the potential significance of these new technologies. What was less clear was whether these state-of-the-art capabilities would redefine the nature of war all on their own and thereby allow us to predict the character of future conflicts. Much would still depend on the political context in which they were introduced. It is true that the revolutionary vision promised rather one-sided affairs; it was unclear which potential enemies could begin to fight in the way proposed. Certainly, no other country or collection of countries was close to matching the United States in conventional capabilities. Yet what followed was not that America would inevitably crush all of its enemies but rather that adversaries would find new ways of fighting. It was therefore likely that those in conflict with the Americans would use other methods short of challenging them to an unwinnable regular war.

Nor should it have been assumed that the new technologies would only support one way of fighting. A crude example illustrates why. Precision weaponry has been welcomed because it makes it possible to score direct hits on strictly military targets, thereby reducing the risk of harm to innocent civilians. If, however, the intent is to maximize civilian damage, then precision can still be an advantage. The accuracy that makes it possible to miss hospitals, power stations or cultural sites also makes it easier to hit them if that is the aim: the suicide bomber attacking an armored vehicle or a school bus, the 9/11 pilots gunning for the Twin Towers and the Pentagon. Even in the 1990s it was evident that in irregular forms of warfare, local people would become shields, sanctuaries and targets, and outcomes would be determined by the effects on civilian attitudes and behavior. Moreover, many of the key new information and communication technologies, whether satellite navigation, laptop calculations or cell phones, were widely available at low cost. Options were opening up for all forms of violence, and not just regular warfare.


ENEMIES ALWAYS adapt, and politics always matter. That is why, after Iraqi forces were chased out of Kuwait, American military preeminence did not quite translate into lasting dominance the way the revolution-in-military-affairs enthusiasts predicted. The problems that followed were not of the aspirant-great-power type, with states flexing their muscles and challenging the geopolitical status quo; instead, they were connected with dysfunctional countries, regimes turning on their own people, broken state structures and incipient civil wars leading to intercommunal violence, sometimes of a most vicious sort.

It was because of such changes in the strategic environment that so much of the debate in the 1990s turned into questions about the extent to which it was possible or appropriate for Western countries to intervene in these situations. Precisely because the claims went back and forth, with compelling arguments in favor of both getting in and staying out, these humanitarian interventions became the archetypal "wars of choice." And the insistent pressure to act became part of a wider argument about whether the military should prepare solely for "big wars" or start to adapt training and doctrine to enable it to cope with operations that had previously been dismissed as "other than war." The United States generally erred on the side of caution. It bailed out of Somalia once peacekeeping forces became embroiled in interclan warfare, not long after the "Black Hawk Down" incident in October 1993, and the next year notably failed to get involved during the genocide of the Tutsis by the Hutus in Rwanda. Even as the former Yugoslavia fell apart after 1991 and in the face of Serbian "ethnic cleansing" first against Croatians and then against Bosnian Muslims, the American preference was to offer airpower rather than infantrymen, though a number of allied forces were heavily involved on the ground. There was no shortage of difficult conflicts that raised questions of obligation; and when it came to getting involved, especially with land forces, the political will was notably weak.


ONE REASON for this was the skeptical view taken by many in the American military establishment. Colin Powell, as chairman of the Joint Chiefs of Staff from 1989 to 1993, spoke disparagingly of "constabulary" duties. The armed forces of the United States, the argument went, should be saved for big wars against proper enemies. This thinking reflected the scars of Vietnam, a formative experience for Powell and the other senior commanders of his generation. They deemed the type of counterinsurgency campaign unsuccessfully practiced in Southeast Asia to have been a massive and painful distraction. After Vietnam, the military supported a return to the army's core business of preparing to deal with major Communist offensives in Europe and Asia.

Pullquote: Talking of a war of choice opens up debate; asserting a war of necessity closes it down. Choices can be good or bad; necessity chooses for you.