It has now been six years since the start of the subprime financial crisis, an event that many people mentally connect with the September 2008 collapse of Lehman Brothers. But the crisis that erupted in the financial markets that year actually began several years before. Just as the Great Crash of 1929 started with the unraveling of the Florida real-estate bubble and other events in the mid-1920s, the financial bust of 2008 was visible years in advance, as real-estate investors and lenders began to show mounting signs of stress.
Most observers connect the crisis with events in the financial markets: securities fraud by Wall Street investment houses, bad mortgages extended by lenders, and, even worse, policy from Washington that encouraged an artificial increase in homeownership. But the roots of the crisis go far deeper than any single decision, action or policy originating in the 2000s. Thus, when people consider the financial meltdown of 2008 and ask whether the right policy tools were used to combat the crisis, the first thing we need to ask is: what, exactly, is the question?
Did we overreact to the crisis of 2008? Or did we underreact? Was the fiscal and monetary stimulus applied since 2008 the best way to address the crisis? What lessons can we learn from the financial meltdown? All of these questions suppose that we actually understand why the crisis occurred, and in particular, whether it was due to a set of narrow problems originating in the financial markets around the time of the event or instead part of a broader set of economic and political factors that go back decades.
You can argue, for example, that the roots of the housing crisis lie in the push for “affordable housing” during the Clinton administration; in fact, the wellsprings of the crisis stretch back decades earlier. Look at the federal funds rate maintained by the Federal Reserve going back to the 1950s, when the U.S. central bank regained its independence from the U.S. Treasury after years of keeping interest rates artificially low to support the war effort. The history of short-term interest rates not only illustrates the ebb and flow of U.S. monetary policy over the past sixty years, but also serves as a surrogate for other factors, including changes in demographics and inflation, that have had a profound impact on the U.S. economy over those six decades.
In the 1950s and 1960s, interest rates were relatively low as the nation sought to grow after the privations of the Great Depression and the economic constraints imposed during WWII. As the demographic bulge known as the “baby boom” reached adulthood in the 1970s and 1980s, however, interest rates rose as inflation became a major public concern. During this same period, federal spending and the growth of public debt also surged, adding a new complication to the economic-policy mix.
Even as issues like inflation and the federal deficit became public-policy concerns, though, growth and job creation remained the top priorities for both political parties. In 1978, Congress passed the Full Employment and Balanced Growth Act, better known as the Humphrey-Hawkins Act, to mandate that unemployment should not exceed 3 percent for people twenty years old or older, and inflation should be reduced to 3 percent or less, provided that its reduction would not interfere with the employment goal. And the law provided that by 1988, the targeted inflation rate should be zero, again, provided that pursuing stable prices would not interfere with the 3 percent target for unemployment.
The legislation was named after two of the more liberal members of Congress at the time, Hubert Humphrey of Minnesota and Augustus Hawkins of California, two men who could be generously described as socialist in their economic orientation. Humphrey actually wanted the government to explicitly guarantee a job for all Americans, but fortunately, Congress balked at creating that level of personal entitlement. Despite the so-called dual mandate of low or no inflation and “full employment,” the Fed has, since the late 1970s, consistently used lower and lower interest rates to boost nominal job growth.
Through the S&L crisis of the 1980s, the Fed kept interest rates relatively high, but by the early 1990s, rates gradually fell and would remain at low levels through much of the decade. During this period, baby boomers reached their maximum earning years and federal deficits almost disappeared because of swelling tax rolls. By the end of the 1990s, however, a stagnant housing sector again caused Washington to resort to policy machinations to boost short-term job creation and consumer-spending growth.
The Fed under Chairman Alan Greenspan had been lowering interest rates for months prior to the September 11, 2001 terrorist attacks, then accelerated the easing and kept rates extremely low, with fed funds trading below 1 percent, into 2004. By the time the Fed started to raise interest rates in mid-2004, the U.S. mortgage market was booming and the Wall Street deal machine was already spinning out of control like a runaway nuclear reactor.
Lenders would create almost $4 trillion in new mortgages in 2004. By comparison, that is four times the level of new mortgage originations expected this year. More aggressive lenders, such as Countrywide and Washington Mutual, were turning over their balance sheets several times in 2004 and accounted for more than one third of all mortgages originated that year. Investment banks, such as Lehman Brothers and Bear Stearns, were likewise running at maximum capacity in terms of deal flow, selling toxic securities to investors as fast as they could do deals. And all of these firms and many others were already doomed to fail long before the Fed managed to push short-term interest rates up to 5 percent by 2006. When the financial crisis finally broke in 2008, the Fed responded by pushing interest rates back down to levels below the 2001-2005 period, and there they have remained. Thus, when we ask whether the response to the crisis has been effective, the first thing to consider is what has changed.
The low-interest-rate environment that arguably helped cause the 2008 financial crisis remains in place today. What has changed is that government policy has shifted dramatically and now is arguably stifling job creation. New regimes, such as the Basel III bank-capital rules and the 2010 Dodd-Frank law, are constraining credit expansion and economic growth. The jobs that are available are mostly going to older Americans, while young people looking to start careers and families are left out. But America’s addiction to low interest rates is unaffected and is not likely to change in the near term.
Today, when Fed Chair Janet Yellen and her colleagues talk about using ultra-low interest rates to boost employment, they are taking a page from the monetary playbook that dates back to the mid-1970s, when revered statesmen like Hubert Humphrey and Augustus Hawkins thought that they could legislate economic outcomes by passing laws in Washington. In fact, while little has really changed in terms of economic policy in the wake of the 2008 financial crisis, a great deal has changed in the structure of the U.S. economy. Until our policy makers begin to take notice of that fact and start to change the way in which we approach economic policy, the outlook for growth and jobs is unlikely to change very much, no matter what is said and done in Washington.
As IMF chief Christine Lagarde warns, the world economy is threatened by a “new mediocre” of low growth for a long time to come.
Christopher Whalen is the author of the 2010 book Inflated: How Money and Debt Built the American Dream, now in its second edition from John Wiley & Sons. He has just completed a new book, co-authored with Frederick Feldkamp, entitled Financial Stability: Fraud, Confidence & the Wealth of Nations, also published by John Wiley & Sons. Follow him on Twitter: @rcwhalen.