America's Innovation Edge Is at Risk

Only the government has the resources and long-term interest to accept the risks inherent in funding basic science.

It seems a growing number of people refuse to accept living in a post-fact world. Saturday’s March for Science was billed by its organizers as necessary “to defend the vital role science plays in our health, safety, economies, and governments.”

It is no coincidence that the march was held during the opening weeks of Donald Trump’s presidency, and on Earth Day to boot. Many of the nation’s scientists fear a serious threat to both science itself and federal policies rooted in science. Look no further than the hatchet the Trump administration hopes to take to the federal government’s research and development budgets, or its stated goal of eviscerating or weakening evidence-based regulations across a number of areas, most prominently environmental policy.

How has it come to this? The fact that scientists feel a need to organize mass marches is a sign of something gone very wrong. As in so many areas of contemporary American life, science has become a political football. For whatever reason, a sizable portion of our body politic no longer equates science with progress and prosperity. That sentiment has been years in the making.

This situation is dangerous because it undermines America’s preeminent role in the world writ large. Principally, it endangers the most successful tech-driven innovation system in human history. Since the early postwar period, that system has consisted of three interlocking components: pure lab-based scientific research, practical technological development, and commercially viable innovation. Many of the amazing technologies we take for granted, including silicon chips, the computer, the internet, the global positioning system (GPS) and mobile communications, came out of this innovation system. Some of these technologies are critical to national security and public well-being, whether advanced lasers to shoot down North Korean missiles, biotech breakthroughs to cure cancer or prevent Alzheimer’s, or clean energy to help solve climate change.

After World War II, the U.S. government, together with state and local governments, research universities and the private sector, set into motion America’s modern innovation system. The initial goal was outlined in a 1945 report by Vannevar Bush, one of America’s top engineers and a key administrator behind the Manhattan Project. Titled Science—The Endless Frontier, Bush’s report drew on wartime experience and called for a robust postwar federal presence in scientific research. The primary motive was obvious: maintaining America’s military edge, and hence its geopolitical advantage, over all rivals would require sustained technological superiority over them as well. Building on this logic, over the following decades the federal government crafted the basic scientific research infrastructure that would enable much of the nation’s technical innovation. The federal government became the nation’s primary investor in basic scientific research, principally by channeling merit-based research grants through universities and a burgeoning network of federal research laboratories.

At different points during the Cold War, the federal government strengthened this presence in American science. In 1957, for example, the Soviet Union’s Sputnik launch alarmed the U.S. government and much of the American public alike. A few months later, President Eisenhower created a risk-taking, forward-thinking research agency, the Advanced Research Projects Agency (known since 1972 as the Defense Advanced Research Projects Agency, or DARPA), and tasked it with ensuring that the United States be the world’s greatest technological disruptor. DARPA has been behind many of America’s greatest postwar inventions, including the internet and GPS. It channels huge sums of federal dollars through the nation’s universities and federal research labs, much of which is spent on blue-sky ideas that may or may not pay off—which is the point of risk-taking, after all.

Public research funding is essential because only the government has both the resources and the long-term interest to accept the risks inherent in funding basic science, the wellspring of all technological breakthroughs. The private sector, by contrast, tends to invest its research dollars in technologies that are nearly mature and can be commercialized in the near term, precisely because basic scientific research may never pay off. The public and private roles are therefore complementary: the government funds the scientific research that gives rise to technological breakthroughs, while the private sector focuses on developing the most commercially viable technologies.

That brings us to the present day. The Trump administration, through its proposed budget, appears unaware of either this history or the consequences of kneecapping American science. Sadly, the president’s budget is but the latest and most visible step in a long erosion of the research and development (R&D) components of America’s innovation system. Over the past several decades, federal funding for R&D has declined in relative terms, measured as federal research spending as a share of the nation’s GDP, from around 1.2 percent in 1976 to about 0.8 percent in 2016.