EARLIER THIS year, West Point's Defense and Strategic Studies Program invited me to participate in a panel discussion on the future of warfare. For historians, and particularly for students of the Vietnam War like me, such requests seem fraught with peril. Given the contentious debate that continues to surround America's involvement in Vietnam, now fifty years after Lyndon Johnson's fateful decision to send ground combat troops to Southeast Asia, commenting on the future of warfare invites conjecture without much evidence. Yet for uniformed officers considering strategic issues and the use of military force, these questions surely are as sensible as they are unavoidable. How can soldiers prepare for future war without thinking about its latest incarnations?
The guidance for the panelists underlined two questions: “What will be the dominant trend in warfare from 2015–2035?” and “How should the U.S. military and government prepare for this trend?” Perhaps shying away from such an imposing query, I found myself dissecting the question itself. The prompt contained a host of assumptions and deeper questions. Would there be, for instance, only one dominant trend over the next twenty years? Could one find in the United States’ last thirteen and a half years of war a certain trajectory of technological or political developments hinting at the future of warfare?
Most importantly, the question seemed to assume, almost reflexively, that the United States would be at war over the next twenty years. (Peace, apparently, was not likely to be a dominant trend.) Such assumptions should give us pause. Yet preparing for war, even engaging in war, without asking why war is necessary has arguably become part of our national psyche. In a larger sense, the United States has been at war for so long that, collectively, its citizens and leaders have become uncomfortable with, if not frightened by, the very idea of peace. After decades of being at war, we have come to the point where we can't live without it.
This willing acceptance of perpetual war offers a congenial (and lucrative) market for national-security visionaries who glance into the future and offer advice on defense-related topics ranging from cyberwarfare to the use of drones. Pundits offer advice on the “militarization of cyberspace” and the likely arms race that will ensue given the United States’ reliance on drone technology in counterterrorism operations. Other oracles, such as David Kilcullen, have placed their forecasts within an operational environment they see as increasingly crowded, urban and connected, much different from the remote and rural Afghanistan in which Americans have been bogged down for over a decade. Still others, like former British Army officer Robert Johnson, have highlighted Western military officers’ concerns over the legal aspects of wars in which they “will be too constrained to maneuver at all in the future.”
Of course, we should not conflate war and defense. Arguably all nations require a defense strategy, even in times of peace. Yet too few of the predictions on war’s future offer meaningful explanations of the necessity of perpetual war. Rather, they content themselves with statements about national vulnerabilities, the need to meet impending threats (real or hypothetical), or military requirements to keep the country safe. The 2015 National Security Strategy, published in February, offers a case in point. While acknowledging America’s growing economic strength and the benefits of moving beyond the large ground wars in Iraq and Afghanistan, the document stresses the “risks of an insecure world.” Despite its global power and reach, the United States, we are told, faces a “persistent risk of attacks.” The escalating challenges are manifold—threats to the nation’s cybersecurity, aggression by Russia, rising violent extremism and an evolving danger posed by the catchall menace of “terrorism.” We live in a dangerous world, the document’s authors say, one in which only vigilant nations—led, naturally, by the United States—preemptively rooting out evil can survive.
Perhaps unsurprisingly, explanations of the necessity of war have tended to downplay the economic aspects of global engagement. Americans traditionally have been uncomfortable with the word “empire,” even if its current form suggests securing economic access abroad rather than promoting traditional colonialism. Andrew J. Bacevich’s diagnosis that the purpose of American grand strategy, since at least the early 1990s, has been to create “an open and integrated international order based on the principles of democratic capitalism, with the United States as the ultimate guarantor of order and enforcer of norms” can seem jarring. Rather more appealing to most are President George W. Bush’s remarks on the fifth anniversary of the September 11, 2001, attacks. In three paragraphs alone, the president employed the word “freedom” ten times. Terrorists feared freedom. Evil enemies, we were told, hated freedom, rejected tolerance and despised dissent. Americans, however, were “advancing freedom and democracy as the great alternatives to repression and radicalism.” War meant liberty triumphing over evil rather than promoting the nation’s economic interests abroad. And so on.
IF OUR compulsion for war cannot be explained fully using the lofty terms of liberty and freedom, some scholars highlight the potential consequences of a growing divide between civilian policy makers and a professional military caste. The volunteer armed forces of the United States, increasingly professional and isolated from civilian Americans, have become, in the words of Peter D. Feaver, Richard H. Kohn and Lindsay P. Cohn, "more alienated from, disgusted with, and even hostile to civilian society." Sentiments such as these preceded more than a decade of war in which U.S. soldiers increasingly have defined themselves as a special, if not exceptional, community apart from, or even superior to, the larger population they have been entrusted to defend. The implications of this civil-military gap for the propagation of war are not inconsequential.
The Atlantic's James Fallows, for example, argues, "America's distance from the military makes the country too willing to go to war, and too callous about the damage warfare inflicts." Wastefully spending money on weapons unrelated to "battlefield realities," our military-industrial complex instead purchases hardware based on an "unending faith that advanced technology will ensure victory." In collaboration, an uninformed public, estranged from the soldiers who ostensibly protect it, instinctively throws its support behind policy makers clamoring for increased military spending and an interventionist foreign policy. To do otherwise puts one at risk of being branded weak, cowardly or even un-American.
As the distance between soldiers and civilians has grown, Americans have become less troubled by the idea of permanent war. As early as 1995, the historian Michael Sherry documented the militarization of American life, a decades-long trajectory originating before World War II in which "war defined much of the American imagination" and "the fear of war penetrated" American society. Though Sherry ended on a guardedly hopeful note—that Americans might "drift away from their militarized past"—more recent critics, like Bacevich, have denounced our society's increasingly comfortable relationship with war. Extending Sherry's analysis beyond the events of September 11, Bacevich persuasively maintains that the seduction of war overpowers rational thinking on the possibilities and, more importantly, limitations of military power abroad. Instead, we instinctively equate American superiority with military superiority.
Arguments asserting America's political and cultural superiority based on its military might surely make for gripping reading. Yet within this line of reasoning rests a good deal of hubris. How many U.S. soldiers recently returned from war convinced they were exceptional, set apart not just from the American public that sent them to war but also from the Iraqis and Afghans among whom they fought? Lost in the political debates surrounding Clint Eastwood's retelling of Chris Kyle's American Sniper is the similitude of soldiers' attitudes toward "the other." Was Kyle representative of his peers when he deemed his adversaries in Iraq a "savage, despicable evil"? Did most veterans return home believing the world was a "better place without savages out there taking American lives"? It is likely that even soldiers traumatized by their experience in war arrived back in the United States with a renewed sense of superiority for American culture and values.
In this sense, our national infatuation with war can be partly explained by how it appears to ennoble us, even—perhaps especially—on a personal level. This conviction is hardly novel. As Kristin Hoganson has recounted of the Spanish-American War, martial endeavors overseas were seen as a way to "vitalize" American manhood. The nation would profit as tested veterans evolved into model citizens and leaders. Little room was afforded views that questioned war's capacity to build robust men for a strong, globally minded republic. Thus, according to Hoganson, "Imperialists benefited from the widespread tendency to construe opposition to war as a sign of cowardice, weakness, or other supposedly unmanly attributes."
This seemingly anachronistic rendering of gender norms perseveres within American society. We still believe, based more on conviction than evidence, that war fosters masculine values while promoting freedom at home and abroad. Sebastian Junger, who spent over a year chronicling an American infantry platoon serving at a remote outpost in eastern Afghanistan, found that war remains a rite of passage for some teens searching for the surest path to manhood. “An extremely compelling endeavor for a lot of young men,” war supposedly gives their lives meaning. Thus, the powerful narrative of war draws us in, captivates our imagination and offers opportunities to prove our worth. “We all want peace,” Junger asserts, “but we’re all fascinated by the drama of war.”