World War II seems like a pretty obvious example of successful industrial policy, at least in the sense of government directing science research toward specific goals. This from the new working paper “Organizing Crisis Innovation: Lessons from World War II” by Daniel P. Gross and Bhaven N. Sampat: “The [Office of Scientific Research and Development]’s priorities were demand-driven, focused on solving specific military problems, and led by input from the Armed Services. The bulk of its work was applied in nature, and while basic studies were sometimes needed, the urgency of the crisis meant that it mostly had to take basic science as given and to put it to work.”
And Washington’s effort at Big Science produced many notable successes. In just half a decade, the paper notes, there were major advances across a range of technologies: radar, electrical engineering, jet propulsion, optics, chemistry, and atomic fission. That final one, of course, was the Manhattan Project, which produced the atomic bomb. Certainly, such an impressive roster gives much comfort to those today calling for increased science funding, especially for “moonshot” projects to solve specific problems in areas such as clean energy, pandemic preparedness, extending human lifespans, and space exploration and exploitation. Indeed, I would note that Joe Biden’s website calls for “a New $300 Billion Investment in Research and Development (R&D) and Breakthrough Technologies — from electric vehicle technology to lightweight materials to 5G and artificial intelligence.”
But it must also be noted that World War II innovation drew upon a deep reservoir of existing scientific knowledge. It was forced to. There was no time to embark upon novel and uncertain efforts at new discovery. As Gross and Sampat conclude: “An additional lesson is thus the importance of strategically investing in science and technology in regular times, to draw on in crises. This may include investing in basic research and developing the scientific workforce, cataloguing top individuals and organizations to enlist for unplanned urgent research problems, and insuring supply chains.”
While the historical importance of basic research should make it an easy political sell — especially if new ideas are getting harder to find — it often isn’t. Research projects with little obvious relevance to real-time, real-world problems can be easily attacked as wasteful spending. Others will charge that it crowds out private investment. But one recent study finds that “government-funded R&D in general—and defense R&D in particular—are effective at raising a country’s total expenditures on innovation in a given industry. The ultimate effect of government-funded R&D on overall R&D significantly exceeds its dollar value because government-funded R&D stimulates additional R&D investment on the part of the private sector.”
So crowding in rather than crowding out. And this from the Congressional Budget Office: “In CBO’s judgment, federal spending for R&D has a small but noticeable positive effect on the amount of money the private sector spends on R&D.”
Yet I also worry about a lack of venturesomeness in government efforts, an issue that suggests government funding models should also include things like prizes and lotteries, as a new Niskanen Center report suggests. Of course, these alternative funding models could also be financed privately. As innovation researcher Jason Crawford has noted, “A remarkable amount of progress in history has been made by individuals researching or inventing outside the context of any formal institution.” This is one reason to be skeptical of those eager to confiscate wealth and thereby cut off a source of non-governmental funding for innovative projects.
This article first appeared in 2020 on the AEI Ideas blog.