The 1960s were a time of rapid economic growth, driven in part by strong productivity growth. (The labor force was also expanding at a rapid pace.) Plenty of economists, futurists, and policymakers thought the good times would never end. Keynes’s “economic problem” of material abundance for all was surely well on its way to being solved.
More than that, the booming economy seemed likely to bring about a future roughly in sync with the techno-optimist science fiction of the time, including The Jetsons, Star Trek, and 2001: A Space Odyssey. But then the era of fast productivity growth ended in 1973 — aside from a blip up in the late 1990s and early 2000s. If Northwestern University economist Robert Gordon is right, advanced economies around the world had fully exploited the great advances of the past, such as electrification and the internal combustion engine. A “special century” of fast progress dating back to the 1870s was over.
But what did experts back then think was going on? At first, it seemed obvious to write it off as the effect of oil shocks and rising inflation. Later, economists wondered about the impact of the flood of environmental regulation enacted in the early 1970s. But decades later, there is still no hard, clear answer. At least, there does not seem to be a single answer for that initial period of slowdown. In a May podcast chat with me, University of Chicago economist Chad Syverson said this:
So 1974 seems to be special. Since that point, we haven’t been able to really sustain — with the exception of the 10-year period I mentioned in the US — average labor productivity growth much above two percent per year. And I don’t have a really sharp sense of whether there’s one explanation for that, or whether there are multiple things. There are so many candidates. My guess is anything that big and that long probably is multi-causal. Which things explain it, and how important is each one? That’s a hard question.
Relatedly, earlier this year I also touched on this broad topic with Stanford University economist Peter Klenow:
Pethokoukis: Looking back, how do you think economists will remember the 2010s? As an impressive decade, since we’re in the middle of the longest economic expansion in US history? Or will they look at it as a disappointing decade in which the economy puttered along at about two percent growth, which is slower than the post-WWII average? How will they — and you — look back on the 2010s, economically?
Klenow: Well, cyclically, it looks great, like you said, in terms of the unemployment rate falling more or less continuously and employment rising. But in terms of income per worker, it’s been much more disappointing. Looking back, we had that slow growth period after the oil embargo in the early 1970s — it lasted some twenty years. And then we had this freak period of ten years of rapid growth. And maybe we told ourselves when the economy then grew slowly from like 2005 to 2010 that that was partly the after-effects of the Great Recession and the global financial crisis. So I think we would be disappointed when we look back and say that the growth rate didn’t pick back up again, with productivity growth averaging something like one percent. So in terms of output per worker per hour, it’s been really disappointing I think, from the longer-run perspective.
Is it possible we get an economy that’s this big, given the demographics, to grow at three percent going forward? Whether it comes from AI or robotics increasing productivity, is a three percent economy still possible for the United States?
I think it is, but it’s going to have to come through some sort of technological revolution, like you’re mentioning. Because if I think about the growth rate of resources we can devote to research, even if we allowed twice the H-1B visas, that’s not going to double the growth rate of research activity in a way that could feed the growth rate and push it up to two or three percent in terms of productivity. So where I think the acceleration would have to come from is a rich vein of technological progress in things like AI and machine learning.
This article first appeared at the American Enterprise Institute.