The Myth of America’s Great Stagnation

The Age of Innovation Isn’t Over

Is the great age of American economic growth over? You’d be forgiven for thinking so. Despite recovering job growth—the US economy added an estimated 203,000 jobs in November—the United States is likely to experience slower GDP growth in the decades ahead. Since 1960, GDP growth has averaged 3.3 percent a year. But the Federal Reserve predicts a rate of 2.1 to 2.5 percent going forward, and JPMorgan even projects a rate of less than 1.75 percent. The longer trajectory is grim: US economic growth has been gradually decelerating for decades, from a 70-year average of 3.6 percent (1939-2009) to a 10-year average of just 1.9 percent (1999-2009).

Prominent economists from Robert J. Gordon to Tyler Cowen see these numbers as signs of a long-term economic slowdown. In a Nov. 8 discussion at the International Monetary Fund, former Treasury Secretary Lawrence Summers even speculated that the United States might be stuck in “secular”—that is, long-term—“stagnation.” Gordon put the point more forcefully in a widely circulated paper with the all-caps headline “IS U.S. ECONOMIC GROWTH OVER?”

It’s true—long-term growth is slowing. But it’s not all bad news. Remember that slow growth is not no growth; any rise in per capita GDP helps boost the standard of living. And good policies that improve productivity can even boost overall growth.

GDP growth is driven principally by two factors: labor-force growth, which comes from increases in population and labor-force participation, and productivity growth, the ability to produce more goods and services with the same number of workers or fewer, thanks to innovative technology or organization. Demographic trends, without a doubt, are putting the first factor in danger, with population growth and labor-force participation both in long-term decline.
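To make that arithmetic concrete, here is a minimal growth-accounting sketch, assuming the simple identity that GDP growth is roughly labor-force growth plus productivity growth. The individual figures are drawn from projections cited elsewhere in this article, but pairing them this way is purely illustrative, not an official forecast.

```python
# Back-of-the-envelope growth accounting, assuming the simple identity:
#   GDP growth is roughly labor-force growth + productivity growth (output per worker).
# The figures are drawn from projections cited in this article; pairing
# them this way is an illustrative assumption, not an official forecast.

labor_force_growth = 0.004   # ~0.4% a year: the low end projected for 2020-2050
productivity_growth = 0.015  # 1.5% a year: Dean Baker's slow-growth scenario

gdp_growth = labor_force_growth + productivity_growth
print(f"Implied GDP growth: {gdp_growth:.1%}")  # ~1.9%, close to the 1999-2009 average
```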

The average number of children per woman in the United States has dropped from 3.5 in the 1950s to roughly the replacement level of 2.1 today. Meanwhile, labor-force participation—the percentage of the working-age population in the workforce—is expected to decline from 67.1 percent in 2000 to 62.5 percent by 2020 as a result of the retirement of the Baby Boomers (even without factoring in lingering unemployment from the Great Recession). As population growth slows and labor-force participation declines, the growth rate of the US labor force is expected to fall, from an average of 0.8 percent from 2000 to 2010 to an average of 0.7 percent from 2010 to 2020—and perhaps as low as 0.4 percent from 2020 to 2050.
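The two demographic drags combine in a simple way: the labor force equals the working-age population times the participation rate, so its growth rate is roughly population growth plus the annual drift in participation. In the sketch below, the participation figures come from the projections above, while the population-growth number is a hypothetical assumption chosen for illustration.

```python
# Labor force = working-age population x participation rate, so (to a first
# approximation) its growth rate is population growth plus the annual drift
# in participation. Participation figures are from the article; the
# population-growth number is a hypothetical assumption.

p_2000, p_2020, years = 0.671, 0.625, 20
participation_drift = (p_2020 / p_2000) ** (1 / years) - 1
print(f"Annual participation drift: {participation_drift:.2%}")  # ~ -0.35% a year

population_growth = 0.0105  # hypothetical working-age population growth
labor_force_growth = population_growth + participation_drift
print(f"Implied labor-force growth: {labor_force_growth:.1%}")  # ~0.7%, the 2010-2020 projection
```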

The power of governments to increase labor-force growth is limited—pro-natal policies to raise national birthrates have failed or shown limited results in the countries that have tried them, and expanding opportunities for legal immigration often encounters political resistance. When it comes to the other part of the growth equation—productivity—policymakers have somewhat more leeway. The productivity growth rate depends chiefly on technological innovation and diffusion, both of which can be boosted by investment from the public sector, private venture capital, nonprofit funding, and individual genius and entrepreneurship.

If the United States is to reverse, or at least mitigate, slowing growth, productivity might be our best hope. The contribution of labor-force growth to GDP growth has plummeted, from 46 percent in the 1960s to less than 20 percent since the 2000s, according to the McKinsey Global Institute—which means productivity now accounts for more than 80 percent of that growth. In other words, growth of the American economy in the future will depend not on adding masses of people to the workforce but almost entirely on improvements in how much we can produce and how quickly.
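Those shares follow directly from the same decomposition: each factor’s share is its growth rate divided by total GDP growth. The rates in the sketch below are hypothetical, chosen only to reproduce the shares cited in the text; they are not McKinsey’s underlying data.

```python
# Contribution shares from growth accounting: each factor's share of GDP
# growth is its own growth rate divided by the total. The rates below are
# hypothetical, chosen only to reproduce the shares cited in the text.

def labor_share(labor_growth: float, productivity_growth: float) -> float:
    """Labor-force growth's share of total GDP growth."""
    return labor_growth / (labor_growth + productivity_growth)

print(f"1960s: {labor_share(0.017, 0.020):.0%} of growth from labor")   # ~46%
print(f"2000s: {labor_share(0.004, 0.016):.0%} of growth from labor")   # 20%
print(f"2000s: {1 - labor_share(0.004, 0.016):.0%} from productivity")  # 80%
```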

Economists like Gordon and Cowen are pessimistic about growth in part because they see the age of innovation as all but over. Gordon, for one, suggests that “the rapid progress made over the past 250 years could well turn out to be a unique episode in human history,” arguing that tomorrow’s scientists and engineers are unlikely to devise new technologies as economically revolutionary as those of the last century and a half—think the steam engine, the automobile engine and electricity. In his 2011 book The Great Stagnation, Cowen similarly identified an “innovation drought,” arguing that earlier waves of innovation have plucked the “low-hanging fruit,” making further technological progress more difficult and expensive. It was easier to invent the steam engine than to invent the jet engine, which in turn was easier than sequencing DNA. As innovation slows, Gordon and Cowen’s argument goes, so too will economic productivity and overall growth.

But who’s to say there aren’t 21st-century versions of the steam engine that are yet to come—or already here? As the economic historian Joel Mokyr has written, “Technology has not finished its work; it has barely started.” Indeed, the Austrian-American economist Joseph Schumpeter, who coined the phrase “creative destruction,” argued nearly a century ago that technological progress goes through stages, in which there is often a lag between the invention of a transformative “general-purpose technology” and the uses that are eventually found for it. Decades, for example, elapsed between James Watt’s development of the steam engine in the late 18th century and the perfection of railroads and steamboats that could open previously impassable continental interiors in the United States and elsewhere.

Today, the personal computer and the iPhone are often considered the most transformational innovations in modern information technology. But other innovations that build on these devices—for instance, the self-driving automobiles pioneered by Google, and Amazon’s experiments with drone deliveries—might yet transform the way we live, work and shop even more dramatically than the desktop computer did. The application of IT to manufacturing has produced 3-D printing and other kinds of advanced, do-it-yourself manufacturing that have vast, and still uncertain, implications for society, government and the economy.

Meanwhile, a fourth industrial revolution may be in gestation—poised to yield a whole new set of innovations like nothing we’ve seen before. In laboratories and factories around the world, researchers are experimenting with new materials, molecular-level assembly and biotech—including “in vitro meat,” food grown in labs from stem cells. Many of these technologies are not yet ready for prime time, and their limited availability has so far disappointed overly optimistic investors. But similar complaints about the slow pace of computerization were heard in the 1970s and 1980s—right before Bill Gates and Steve Jobs came along and brought us the personal computing revolution.

There is a growing recognition that technological progress can take place only in a healthy “innovation ecosystem” in which government and universities, big corporations and small start-ups all play vital and complementary roles. In the debate about fostering technological breakthroughs, as in the debate about whether innovation is suffering from a drought, the contending camps do not divide along conventional political left-right lines. In the past generation, neoliberal Democrats have tended to agree with many conservative Republicans that the best way for public policy to foster innovation is to stick to the basics, including free markets and moderate government investment in basic R&D, public education and infrastructure.

Others, including the venture capitalist William Janeway and the economist Mariana Mazzucato, hold that such policies are necessary but not sufficient to promote long-term economic growth. They rightly argue that government can play a crucial role in helping new industries scale up by acting as their first major customer, the way the US military did for much of the early computer industry, from mainframes to the ancestors of the Internet.

Despite headline-grabbing failures (see: Solyndra), even much-vilified forms of government investment that supposedly “pick winners” by backing particular technologies or companies have enjoyed successes. Recent advances in hydraulic fracturing, or “fracking,” have permitted access to previously unreachable natural gas and oil resources, revolutionizing the American and global energy picture over the past decade. Many have pointed to this revolution as evidence that “the private sector, in response to market forces, is better than government bureaucrats at picking technological winners,” as the Breakthrough Institute’s Michael Shellenberger and Ted Nordhaus have observed. But the duo notes that “the breakthroughs that revolutionized the natural gas industry … were made possible by the government agencies that critics insist are incapable of investing wisely in new technology”—agencies including the US Energy Department and the publicly funded Gas Research Institute.

The good news is that any productivity growth rate above zero will double and redouble society’s wealth over time—if only at a slower pace. As the economist Dean Baker has calculated, even if productivity growth were to fall to 1.5 percent annually, lower than the 2.2 percent rate of the last 60 years, the standard of living in the United States would still improve by 40.8 percent between now and 2035—even as the ratio of workers to retirees plummets from 3-to-1 to 2-to-1. Baker reckons that the gains from productivity growth would be “more than five times the size of the negative impact from demographics.” So unless scientific and technological progress has come to an end, which seems unlikely, the standard of living in the United States should continue to improve, even if overall GDP grows at a slower rate.
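Baker’s figure is simply compound growth at work. Here is a quick check, assuming the calculation runs from roughly 2012 to 2035, a 23-year horizon; the article states only the endpoint, so the start date is an assumption.

```python
# Compound-growth check of Dean Baker's figure. The 23-year horizon (roughly
# 2012 to 2035) is an assumption; the article states only the endpoint.

slow_rate, historical_rate, years = 0.015, 0.022, 23

slow_gain = (1 + slow_rate) ** years - 1
fast_gain = (1 + historical_rate) ** years - 1
print(f"Living-standard gain at 1.5% a year: {slow_gain:.1%}")  # ~40.8%, Baker's figure
print(f"Gain at the historical 2.2% a year: {fast_gain:.1%}")   # ~64.9%
```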

That’s no reason to be complacent. In order for a much smaller number of workers to be able to support a much larger population of retirees in comfort, while being much better off themselves, the gains from continued growth will need to be distributed widely, rather than flowing to a tiny elite—which is by no means guaranteed. What’s more, technology-driven productivity growth is not spontaneous or automatic—even the techno-optimists would agree with that. This makes it all the more important to promote collaboration among government, business and universities to come up with innovations that boost productivity. American growth isn’t over. But it won’t continue all on its own either.

Michael Lind is policy director of the Economic Growth Program at the New America Foundation and author of Land of Promise: An Economic History of the United States. He is also a contributing editor at Politico, where this article first appeared.