Ironically, the celebrity aviator Charles Lindbergh, one of the leaders of the anti-interventionist “America First” movement in the run-up to U.S. entry into World War II, agreed that the alternative to destroying the German and Japanese empires before they could be consolidated was creating an American garrison state. He thought it was a nifty idea. Lindbergh declared: “The men of this country must be willing to give a year of their lives to military training—more if necessary.” And Lindbergh called on the United States to invade its neighbors, to create a secure North American empire from which the Germans, Japanese, and others could be kept out. He demanded U.S. bases throughout North America “wherever they are needed for our safety, regardless of who owns the territory involved.”
None of this should give intellectual aid and comfort to today’s neoconservative advocates of “global democratic revolution” or “humanitarian hawks” who favor invasions of countries that do not threaten the United States to defend “human rights” or “the liberal world order.” On the contrary, both of these belligerent approaches to U.S. foreign policy are antithetical to the approach of mainstream U.S. policymakers during the first two-thirds of the twentieth century. Woodrow Wilson agonized over the effects that war might have on American society. Franklin Roosevelt told the American people, “I have seen war … I hate war.” In contrast, according to her memoirs, Madeleine Albright, favoring intervention in the wars of the Yugoslav succession, a conflict of only remote and indirect interest to the United States, asked General Colin Powell: “What are you saving this superb military for, Colin, if we can’t use it?”
Both interventionists and anti-interventionists in the United States, then, have maintained that their preferred policies would minimize the long-term threat that the American republic would be replaced by a militarized, Spartan garrison state. Advocates of intervention to prevent German or Soviet hegemony in Eurasia conceded that the temporary sacrifice of a degree of liberty and democracy during a war or cold war could be harmful, but would be less harmful to the way of life of a civilian, democratic, and liberal republic than the alternative—permanent defensive militarization of American society in an environment of regional empires and recurrent world wars. The anti-interventionists disputed this argument, claiming that U.S. participation in world war or cold war would turn America into a garrison state immediately and permanently.
Which side was right? One result of the global conflicts of the twentieth century has indeed been the emergence of a large, permanent military and defense industrial base, along with a permanent and powerful intelligence community. The world wars and the Cold War diminished the authority of Congress in foreign affairs, while giving the president vastly enhanced discretion in military affairs. Even worse, the “War on Terror” that followed the Al Qaeda attacks on the United States on September 11, 2001, led a panicked Congress to delegate to the president the construction of a surveillance state which genuinely threatens the civil liberties of American citizens, who, for example, may be put on secret “no-fly lists” by government agencies without being told.
But even when these deformations of the American constitutional order are acknowledged, it is clear that the United States overall is not a garrison state. America is not an autocracy. There is no president for life; following Roosevelt’s four terms in the White House, the Constitution was amended to limit a president to two terms. Indeed, two of the last four presidents have been impeached by Congress.
Conscription? Even at the height of the early Cold War in the Truman administration, proposals for universal military training were so unpopular with the public that the United States instead adopted the more limited Selective Service draft, which itself came to an end in 1973 following popular discontent with the costs of the Vietnam War.
Mobilization of industry? From Bill Clinton to Barack Obama, successive presidents, singing the praises of “globalization,” complacently ignored the deindustrialization of the United States thanks to the offshoring of industry by U.S.-based multinationals, to China in particular. They also ignored the damage done to American industry by the mercantilist policies of U.S. allies like Japan, South Korea, and Germany. When Barack Obama left office in 2017, the U.S. military had to purchase rocket engines from Russia, U.S. astronauts had to hitch rides to the International Space Station on Russian rockets, and America had nearly lost its capacity to make many critical tech components, including silicon chips. The Covid-19 pandemic revealed that the United States was almost completely dependent on Chinese factories for many crucial medical supplies, from masks to the chemical precursors used in common drugs. What kind of garrison state makes itself dependent for industrial supplies on a hostile strategic rival like China?
Overgrown military? In 1944, U.S. defense spending consumed a third of GDP. Then, between 1945 and 1950, it plummeted to less than 5 percent. As a share of GDP, U.S. defense spending rose to 11.3 percent at the height of the Korean War in 1953 and 8.6 percent at the height of the Vietnam War—hardly Spartan levels of military consumption. Following the end of the Cold War, defense spending dropped to around 3 percent of GDP, a number to which it has returned after a brief uptick to 4.5 percent at the apex of the wars in Iraq and Afghanistan in 2010. One may believe, as I do, that most of the small wars the United States is fighting in Afghanistan, Syria, Libya, and elsewhere are unnecessary, but a country that spends 3 percent of GDP on the military is not one in which military expenditure threatens to choke off the civilian economy.
As for the “standing army,” that nightmare of generations of Americans, the U.S. military has been radically downsized since the Cold War ended. The number of active-duty military personnel shrank from around two million in 1990 to a little more than a million today. Even when private contractors are counted, this is hardly the swollen military of a totalitarian state.
The American republic is not in danger of becoming a garrison state—not now, nor in the foreseeable future. But that is not to say the American republic is not in danger. The excessive militarization of society in a regimented state is not the only way that our republican social order can give way to a different kind of social order, with its own kind of foreign policy. America’s democratic republic could be warped to the point at which it ceases to be a democratic republic in all but name and morphs into a tributary state or a castle society.
THE BEST-KNOWN example of a tributary state, in the sense in which I am using the term, was Finland during the Cold War. The term “Finlandization” was coined by West German political scientists to describe the process by which Finland accommodated the Soviet Union in foreign policy, in order to maintain its nominal sovereignty and domestic autonomy. The term is unfair to Finland, because it is commonplace for small and weak states to pursue foreign policies that avoid provoking the wrath of powerful neighbors. This is certainly true in America’s neighborhood, where the United States in the last few generations has invaded the Dominican Republic, Grenada, Panama, and Haiti while engaging in proxy war and covert action to install client governments in the region.
The lack of attention to tributary states by theorists of international relations is puzzling, because the category includes the vast majority of regimes in recorded history. Premodern empires and kingdoms were not centralized, bureaucratic states, but loose conglomerations of semi-autonomous lesser kingdoms, satrapies, provinces, city-states, duchies, bishoprics, and other entities. Typically, in return for fealty and tribute, the imperial government allowed a high degree of internal independence in subordinate units. Sovereignty in the modern sense did not exist in such systems; there were only degrees of suzerainty.
The ubiquity of premodern tributary states, including dominions and colonies of the European empires that enjoyed various degrees of self-government before post-1945 decolonization, suggests that the assumption of academic neorealist theory that most states seek to maximize their relative power is too simple. The mistake of crude realism is to confuse states with elites. If the state is not an autonomous agent but merely the instrument of a social elite—an assumption shared by Marxists, populists, and others—then it may be in the self-interest of a dominant elite to maintain or increase its own status within its local society by sacrificing the state’s external sovereignty and making it a protectorate of another regime, particularly if the foreign protector can guarantee the security of the local ruling class against challenges from below. To secure its status, the social elite may even give up national independence altogether in favor of annexation. This is what the Scottish elite did with the Act of Union of 1707 and what the short-lived Republic of Texas decided when it joined the United States as a state in 1845.