5 Ways a Nuclear War Could Go Down (And Billions of People Would Die)

Turns out the Soviet high command, in its pathetic and paranoid last years, was just that crazy. The USSR built a system called Perimetr, known informally in Russia as “the Dead Hand.” Perimetr was essentially a computer system that would watch for signs of nuclear attack and retaliate on its own if the Soviet leadership was struck first and wiped out. (I explained this in more detail for National Geographic, which you can watch here.) We’ve since asked the Russians if it’s still on, and they’ve reassured us, with complete confidence, that we should mind our own business. Let’s hope they’re just being rude.

2. Human Error:

As long as there are machines run by human beings, there are going to be accidents. War, however, will not begin because a bomber crashes or a silo catches fire; rather, the error will lie in the misinterpretation of an accident by fallible human beings.

History is replete with such incidents. In 1995, the Russians forgot that the Norwegians had notified them of a rocket launch to put a weather satellite into space. The Russian high command told President Boris Yeltsin that they had a confirmed rocket launch from NATO territory headed toward Russia. Fortunately, no one in the Kremlin assumed that Bill Clinton was trying to start World War III with a single warhead from Norway. Moreover, the warm relationship between Clinton and Yeltsin made the Russian president skeptical that Russia was under what Cold War strategists used to call a “BOOB,” or “Bolt Out Of the Blue” attack.

Similar mistakes have been provoked by flocks of birds, random computer glitches, and the sun glinting off cloud formations (which was interpreted by Soviet computers as the fiery tails of multiple U.S. missile boosters). In each case, it was up to a human being to make the call: is someone really attacking us? Smart people in both Russia and the United States have prevented these mechanical errors from turning into Armageddon.

Nevertheless, the declassified files on these incidents won’t exactly help you sleep more soundly. In 1979, for example, NORAD, the joint U.S.-Canadian North American Air Defense Command, rousted White House advisor Zbigniew Brzezinski out of bed and told him that a massive Soviet nuclear strike was incoming. Or so they thought, anyway: they were giving him a heads-up while they checked it out. Brzezinski was minutes away from waking President Jimmy Carter and handing him the codes to Hell when NORAD called back and said: Oops, never mind. The computers goofed. Our bad. We’ll fix it.

And then it happened again in 1980.

The Soviets, who got wind of all this activity, politely sent a note to Carter asking him, in effect: What the hell is going on over there? It was a good question, and we’d have asked the same thing.

The key here, however, is not the technology, which makes a Fail-Safe-type accidental launch nearly (but not entirely) impossible. Rather, the danger lies with human beings, who could issue orders to retaliate in moments of duress that cannot be undone. While this is less of a risk when tensions are lower, it’s always a possibility. Combining mechanical error with the natural flaws of human judgment raises the risk of accidental war from an infinitesimal chance to a very real possibility.

3. A Show of Force:

As we move from mechanical errors to human agency, things actually get scarier. Machines can make mistakes, but absent an international crisis and additional confirming evidence, no one goes to war on the say-so of a malfunctioning HAL 9000. While journalists and nuclear safety experts have written some excellent books about accidental detonations and other risks, I worry far more about a conscious decision to use nuclear weapons.

The worst mistake to make about nuclear weapons is to believe that they are ordinary arms, available for military use like any other. (This is sometimes called the “conventionalizing” of nuclear weapons.) The second worst mistake, however, is to believe that nuclear weapons are magic, and that using them solves problems that are otherwise politically or strategically intractable. This second error is what leads people into thinking about things like “demonstration shots” or nuclear shows of force, in which a nuclear weapon is exploded near, but not in, a conflict.

Ostensibly, such a dramatic demonstration will bring all of the combatants to their senses and get them to call off whatever started the whole ruckus in the first place. (Nuclear game-players, by the way, almost never tell us why we’re at the brink. Politics is a messy business that just complicates clean, beautiful models and equations.) This idea is both seductive and dangerous: if our enemy sees our resolve, the logic goes, he will cease his predations. It is possible, of course, that a nuclear explosion could very well focus the attention of Russian, Chinese, or American leaders and produce a “moment of clarity,” in which everyone thinks very hard about what’s at stake and how much they’re willing to risk over it.