AMONG THE MORE prescient analyses of the terrorist threats that the United States would face in the twenty-first century was a report published in September 1999 by the U.S. Commission on National Security/21st Century, better known as the Hart-Rudman commission. Named after its cochairs, former senators Gary Hart and Warren Rudman, and evocatively titled New World Coming, it correctly predicted that mass-casualty terrorism would emerge as one of America’s preeminent security concerns in the next century. “Already,” the report’s first page lamented, “the traditional functions of law, police work, and military power have begun to blur before our eyes as new threats arise.” It added, “Notable among these new threats is the prospect of an attack on U.S. cities by independent or state-supported terrorists using weapons of mass destruction.”
Although hijacked commercial aircraft deliberately flown into high-rise buildings were not the weapons of mass destruction that the commission had in mind, the catastrophic effects that this tactic achieved—obliterating New York City’s World Trade Center, slicing through several of the Pentagon’s concentric rings and killing nearly three thousand people—indisputably captured the gist of that prophetic assertion.
The report was also remarkably accurate in anticipating the terrorist organizational structures that would come to dominate the first dozen or so years of the new century. “Future terrorists will probably be even less hierarchically organized, and yet better networked, than they are today. Their diffuse nature will make them more anonymous, yet their ability to coordinate mass effects on a global basis will increase,” the commission argued. Its vision of the motivations that would animate and subsequently fuel this violence was similarly revelatory. “The growing resentment against Western culture and values in some parts of the world,” along with “the fact that others often perceive the United States as exercising its power with arrogance and self-absorption,” was already “breeding a backlash” that would both continue and likely evolve into new and more insidious forms, the report asserted.
Some of the commission’s other visionary conclusions now read like a retrospective summary of the past decade. “The United States will be called upon frequently to intervene militarily in a time of uncertain alliances,” says one, while another disconsolately warns that “even excellent intelligence will not prevent all surprises.” Today’s tragic events in Syria were also anticipated by one statement that addressed the growing likelihood of foreign crises “replete with atrocities and the deliberate terrorizing of civilian populations.”
Fortunately, the report’s most breathless prediction concerning the likelihood of terrorist use of weapons of mass destruction (WMD) has not come to pass. But this is not for want of terrorists trying to obtain such capabilities. Indeed, prior to the October 2001 U.S.-led invasion of Afghanistan, Al Qaeda had embarked upon an ambitious quest to acquire and develop an array of such weapons that, had it been successful, would have altered to an unimaginable extent our most basic conceptions about national security and rendered moot debates over whether terrorism posed a potentially existential threat.
But just how effective have terrorist efforts to acquire and use weapons of mass destruction actually been? The September 11, 2001, attacks were widely noted for their reliance on relatively low-tech weaponry—the conversion, in effect, of airplanes into missiles by using raw physical muscle and box cutters to hijack them. Since then, efforts to gain access to WMD have been unceasing. But examining those efforts results in some surprising conclusions. While there is no cause for complacency, they do suggest that terrorists face some inherent constraints that will be difficult for them to overcome. It is easier to proclaim the threat of mass terror than to perpetrate it.
THE TERRORIST ATTACKS on September 11 completely recast global perceptions of threat and vulnerability. Long-standing assumptions that terrorists were more interested in publicity than in killing were dramatically swept aside in the rising crescendo of death and destruction. The butcher’s bill that morning was without parallel in the annals of modern terrorism. Throughout the entirety of the twentieth century no more than fourteen terrorist incidents had killed more than a hundred people, and until September 11 no terrorist operation had ever killed more than five hundred people in a single attack. Viewed from another perspective, more than twice as many Americans perished within those excruciating 102 minutes as had been killed by terrorists since 1968—the year widely accepted as marking the advent of modern, international terrorism.
So massive and consequential a terrorist onslaught naturally gave rise to fears that a profound threshold in terrorist constraint and lethality had been crossed. Renewed fears and concerns were in turn generated that terrorists would now embrace an array of deadly nonconventional weapons in order to inflict even greater levels of death and destruction than had occurred that day. Attention focused specifically on terrorist use of WMD, and the so-called Cheney Doctrine emerged to shape America’s national-security strategy. The doctrine derived from former vice president Dick Cheney’s reported statement that “if there’s a one percent chance that Pakistani scientists are helping Al Qaeda build or develop a nuclear weapon, we have to treat it as a certainty in terms of our response.” What the “one percent doctrine” meant in practice, according to one observer, was that “even if there’s just a one percent chance of the unimaginable coming due, act as if it’s a certainty.” Countering the threat of nonconventional-weapons proliferation—whether by rogue states arrayed in an “axis of evil” or by terrorists who might acquire such weapons from those same states or otherwise develop them on their own—thus became one of the central pillars of the Bush administration’s time in office.
In the case of Al Qaeda, at least, these fears were more than amply justified. That group’s interest in acquiring a nuclear weapon reportedly commenced as long ago as 1992—a mere four years after its creation. An attempt by an Al Qaeda agent to purchase uranium from South Africa was made either late the following year or early in 1994 without success. Osama bin Laden’s efforts to obtain nuclear material nonetheless continued, as evidenced by the arrest in Germany in 1998 of a trusted senior aide named Mamdouh Mahmud Salim, who was attempting to purchase enriched uranium. And that same year, the Al Qaeda leader issued a proclamation in the name of the “International Islamic Front for Fighting the Jews and Crusaders.” Titled “The Nuclear Bomb of Islam,” the proclamation declared that “it is the duty of Muslims to prepare as much force as possible to terrorize the enemies of God.” When asked several months later by a Pakistani journalist whether Al Qaeda was “in a position to develop chemical weapons and try to purchase nuclear material for weapons,” bin Laden replied: “I would say that acquiring weapons for the defense of Muslims is a religious duty.”
Bin Laden’s continued interest in nuclear weaponry was also on display at the time of the September 11 attacks. Two Pakistani nuclear scientists named Sultan Bashiruddin Mahmood and Abdul Majeed spent three days that August at a secret Al Qaeda facility outside Kabul. Although their discussions with bin Laden, his deputy Ayman al-Zawahiri and other senior Al Qaeda officials also focused on the development and employment of chemical and biological weapons, Mahmood—the former director for nuclear power at Pakistan’s Atomic Energy Commission—claimed that bin Laden’s foremost interest was in developing a nuclear weapon.
The movement’s efforts in the biological-warfare realm, however, were far more advanced and appear to have begun in earnest with a memo written by al-Zawahiri on April 15, 1999, to Muhammad Atef, then deputy commander of Al Qaeda’s military committee. Citing articles published in Science, the Journal of Immunology and the New England Journal of Medicine, as well as information gleaned from authoritative books such as Tomorrow’s Weapons, Peace or Pestilence and Chemical Warfare, al-Zawahiri outlined in detail his thoughts on the priority to be given to developing a biological-weapons capability.
One of the specialists recruited for this purpose was a U.S.-trained Malaysian microbiologist named Yazid Sufaat. A former captain in the Malaysian army, Sufaat graduated from California State University in 1987 with a degree in biological sciences. He later joined Al Gamaa al-Islamiyya (the “Islamic Group”), an Al Qaeda affiliate operating in Southeast Asia, and worked closely with its military operations chief, Riduan Isamuddin, better known as Hambali, and with Hambali’s own Al Qaeda handler, Khalid Sheikh Mohammed—the infamous KSM, architect of the September 11 attacks.
In January 2000, Sufaat played host to two of the 9/11 hijackers, Khalid al-Midhar and Nawaf Alhazmi, who stayed in his Kuala Lumpur condominium. Later that year, Zacarias Moussaoui, the alleged “twentieth hijacker,” who was sentenced in 2006 to life imprisonment by a federal district court in Alexandria, Virginia, also stayed with Sufaat. Under KSM’s direction, Hambali and Sufaat set up shop at an Al Qaeda camp in Kandahar, Afghanistan, where their efforts focused on the weaponization of anthrax. Although the two made some progress, biowarfare experts believe that on the eve of September 11 Al Qaeda was still at least two to three years away from producing a sufficient quantity of anthrax to use as a weapon.