In the years following the end of the Second World War, the wary postwar states slowly confronted the realities of the new nuclear age. A world in which entire countries could be erased at a moment’s notice is, as the writer Christopher Hitchens put it, a world where every single person has been conscripted into military service—whether they realize it or not.
Difficult as it is to imagine in 2020, the Cold War encompassed a global, decades-long debate over the feasibility of “ethical” nuclear weapons. This is the story of the neutron bomb.
With Cold War competition well underway by the late 1950s, the United States—which relied on nuclear deterrence to offset Soviet conventional superiority in Europe—was in search of a new generation of nuclear weapons. Washington’s European partners were naturally uneasy with this approach. It meant, after all, that any major armed conflict between the North Atlantic Treaty Organization (NATO) and the Warsaw Pact could lead to the detonation of nuclear weapons over West-Central Europe.
In 1958, American physicist Samuel Cohen presented Washington with something of a compromise solution. Cohen pioneered the working concept of a nuclear bomb that would maximize lethal radiation while minimizing the physical blast accompanying its detonation. This low-yield bomb would emit a powerful wave of radiation within a radius of 1,000-2,000 meters, dealing lethal doses of radiation to everyone in range but causing relatively little damage to structures and mitigating the impacts of residual radiation. There was further speculation that the bomb could be effective as a countermeasure against ballistic missiles, potentially melting or otherwise disabling enemy warheads.
The neutron bomb, as Cohen’s invention came to be called, was successfully tested in 1962, but the question of deployment lingered for over a decade. By the late 1970s, the Carter administration demurred. As a skeptical media drew increased attention to the bomb’s existence, President Jimmy Carter seemed to want to offload political responsibility for the bomb’s possible deployment onto the West Europeans. Despite considerable support for the project in Washington and in major West European capitals, including Bonn, the neutron bomb faced relentless opposition from the Western media and activist groups. The Soviet Union joined the messaging war, with Soviet leader Leonid Brezhnev famously calling it a “capitalist bomb”—that is, a weapon designed to kill people while sparing property. In a reversal that stunned even his closest advisors, Carter shelved the neutron bomb’s deployment following mass protests across the transatlantic West. The program was briefly revived by President Ronald Reagan, but the neutron bomb debate again faded from public view with Reagan’s introduction of the Strategic Defense Initiative (SDI). The issue became moot in the following decade with the Soviet collapse and the 1990s drawdown of U.S. tactical nuclear weapons deployed in Western Europe.
Curiously, Communist China developed a “capitalist bomb” of its own in the 1980s. China never deployed its neutron bomb, instead keeping it in a state of “technology reserve”—it seems that Beijing never had concrete battlefield plans for the weapon, but was simply interested in achieving technological parity with Washington.
Nuclear weapons technology has moved on in the decades following the Soviet collapse, but the basic doctrinal question remains: what is the most effective way to maximize enemy battlefield casualties while minimizing structural damage? Both Moscow—which relies heavily on its tactical nuclear stockpile as a source of deterrence—and Washington continue to experiment with low-yield tactical nuclear weapons to the present day.
Mark Episkopos is the new national security reporter for the National Interest.