Seeing the State

In Search of a New History of Economic Modernity

Eighteen months into the Trump presidency, the debate about how to fix the US economy has not really changed. Arguably, there has been greater innovation on the Right than on the Left. “Greatness conservatism” continues to promote an anti-tax and regulatory reform agenda that dates back to the Reagan era, but it has also managed to tack on protectionist trade and restrictionist immigration policies that have historically been anathema to free-market conservatives. On the Left, the songs are even more dated. Proposals for single-payer health care, a government job guarantee, and increased spending to rebuild our failing transportation systems date back to the 1930s.

Meanwhile, a lively discussion is under way, largely on the West Coast, about the impending arrival of robots and artificial intelligence. For some libertarians and techno-optimistic progressives, the fear of ever more massive job losses has led to renewed interest in the idea of a universal basic income to cover the costs of food, shelter, and other necessities. While such proposals are radically different from the job-centered vision of progressive Democrats who want the government to serve as employer of last resort, both factions have in common a focus on distribution rather than production. Each suggests ways to divvy up GDP more fairly, but unlike greatness conservatives, neither has much to say about accelerating the growth of GDP.

History, however, suggests that this strategy is risky. The decades from the 1940s to the 1970s, when the United States had a far more equitable distribution of income and wealth, were also a time of strong economic growth in this country. Some portion of the growing pie made possible gains for those in the bottom half of the income distribution. But when growth comes more slowly, efforts to redistribute are bound to be met with fiercer resistance. This has been the story of the past forty years — rich reactionaries successfully fighting for an ever-larger share of the slowly growing economic pie.

What’s worse, influential economists suggest that slow growth is here to stay. Northwestern University economist Robert Gordon’s widely discussed The Rise and Fall of American Growth (2016) provides a telling example. According to Gordon, the US economy experienced a century of dramatic growth from 1870 to 1970. After 1970, however, growth slowed, and the economy now faces increasingly strong headwinds that will likely lock in this pattern.

Gordon is an old-fashioned Keynesian who is eloquent in his defense of FDR’s New Deal. But that was then; now, Gordon argues, there is little government can do to accelerate the development of innovative new products and processes by businesses. “The potential effects of pro-growth policies are inherently limited by the nature of the underlying problems,” he finds. “The fostering of innovation is not a promising avenue for government policy intervention, as the American innovation machine operates healthily on its own.”

This statement is puzzling, since the thrust of the second half of his book is that the performance of the American “innovation machine” has been in steady decline since 1970. Gordon resolves this tension by arguing that this slowdown is inherent in the development of technology. In short, the US innovation system is doing the best it possibly can because breakthrough technologies with economic payoffs comparable to electrification or indoor plumbing are simply no longer available. But how could he possibly know that? Gordon also ignores the abundant evidence, some of which he cites, that the government has played an extremely important role in fostering innovation in the 19th, 20th, and 21st centuries. In his analysis of the 1930s and 1940s, for example, he shows that public-sector road building was critical for the advances in productivity associated with cars and trucks.

To be sure, techno-optimists have criticized Gordon’s gloomy analysis by pointing to imminent advances in robotics, artificial intelligence, and other computer-related technologies that will accelerate economic productivity. But if robots and artificial intelligence are actually on their way, it will mean much-accelerated displacement of people from existing jobs, exacerbating the economic misery in the nation’s old industrial heartland. Even if a universal basic income were adopted, people’s purchasing power would lag far behind advances in productivity.

The reality is that neither political partisans nor policy-oriented intellectuals have a persuasive strategy for solving the interlocking problems of the US economy. The Right’s free-market solutions of massive tax cuts and regulatory rollbacks have failed repeatedly. Progressives consistently focus on a fairer distribution of resources but ignore the problem of expanding economic output. And most progressive policy intellectuals, whether techno-optimists or techno-pessimists, have nothing that resembles the elaborate game plans that right-wing think tanks handed to Ronald Reagan and George W. Bush when they entered the White House.

When so many intelligent people are unable to see a way to fix something that is clearly broken, the obvious explanation is that their intellectual tools are deeply flawed. The fundamental problem is that scholars, regardless of political orientation, have been working with a false history of how economic modernity emerged and developed in Europe and the United States. This mistaken history of capitalism has in turn hamstrung their ability to see viable paths forward.

1.

Most scholars agree that the key breakthroughs to economic modernity occurred in the Netherlands and England in the 17th and 18th centuries. These developments were then explained in the late 18th and early 19th centuries by the founders of modern economics — Adam Smith, T. R. Malthus, and David Ricardo — who established the core narratives of economic history that would be followed by subsequent scholars down to the present.

Smith, Malthus, and Ricardo were hardly disinterested observers; rather, they were committed to overthrowing what they saw as the reigning economic ideas of their time — a paradigm they called mercantilism. In the late 18th century, most societies were monarchies or principalities, and the most important economic thinkers were those who advised the crown on the best strategies for making the provinces they controlled prosper. Those advisors did not adhere to one specific policy orthodoxy; some preferred freer trade and others more tariffs. But they all believed that enlightened policies by the monarch would contribute to greater prosperity for his or her subjects. It was this approach that Smith defined as mercantilism.

Smith and the other classical economists rejected mercantilism because it put the state at the center of the economic narrative. They believed that with the expansion of commerce, the economy had become equipped for self-governance. As a result, they argued for a self-regulating market system in which supply and demand would be brought into balance through changes in prices. Government would continue to provide the legal framework in which economic activity occurred by enforcing property rights and contracts, but to ensure prosperity, the classical economists argued, the economy needed to enjoy far greater autonomy than it had under mercantilist policies.

In their campaign to defeat mercantilist ideas, the classical economists had to gloss over some historical realities. They argued that the acceleration of economic activity in England and the Netherlands in the 17th and 18th centuries resulted from merchants and entrepreneurs responding to market signals. They did not address the role the Dutch and British navies played in facilitating patterns of global trade and colonial settlement that created enormous flows of wealth to Europe. That wealth depended on the forcible capture and transport of enslaved people from Africa, inflows of silver from Latin America, plantation agriculture across the Americas, and patterns of exploitative trade across the non-European world. These coercive practices all depended in turn on European military superiority. But these facts did not mesh well with the binary division between state and market that the classical economists had erected — for them, almost everything positive came from market activity and everything negative came from the state. As a result, they placed the rising class of merchants and industrialists at the center of their story of economic modernity and pushed the state to the margins of history.

The views of the classical economists were contested at the time, but many of those criticisms were quickly forgotten. The one critique that endured was elaborated by Karl Marx, Friedrich Engels, and their followers, who did not hesitate to highlight the ways that economic modernity in Europe had depended on the brutal exploitation of people outside Europe organized through state-led violence. They also emphasized the state’s role in protecting the power of the bourgeoisie and in repressing anyone who challenged that power.

Marx and Engels could have elaborated a more balanced account of economic modernity that recognized the ways that actions by the state and actions by market actors were intertwined and interdependent. But they were single-mindedly opposed to the idea that the existing order could be successfully reformed through governmental action. Hopeful that the working class would rise against the bourgeoisie, they denounced as futile any efforts to capture or reform the state. Instead, they insisted, once bourgeois class power was overthrown in a socialist revolution, the path would be clear to a self-governing society in which coercive state power no longer held sway.

This notion led them to insist that what appeared to be a powerful and autonomous state was nothing more than a reflection of the class power of the bourgeoisie. And so, while Marxists acknowledged the centrality of state action in the rise of capitalism, they too pushed the state out of the picture. Much like the classical economists, Marxists assured their followers that when it came to the state, there was nothing to see.

In so doing, the Marxists and the classical economists each set the terms of a debate that has shaped almost all subsequent historical accounts of economic modernity in England and the United States. Some of those accounts lean more in the direction of the classical economists, others lean more in the direction of the Marxists, and still others try to incorporate insights from both sides. But the result is that one economic history after another systematically ignores the foundational role of states in creating modern market economies.

The impact of this historical fallacy is visible in the way that economists of all political stripes think and talk about the government’s role in the economy. Today, almost all mainstream economists believe that government intervention is justified in circumstances of market failure — when markets that are left to their own devices produce less than optimal results. Classic examples of market failure include firms dumping toxic waste into rivers and poor people not being able to afford education or health care. Indeed, economists of the Left and Right agree that government action is needed when markets fail; they simply disagree over how frequently such failures occur.

What the market-failure concept assumes is that under normal conditions, autonomous markets operate successfully. But without government, markets could not even exist, much less meet any important human needs. Stable markets need a legal framework that establishes property rights, makes contracts enforceable, and protects against fraud and predation. The weights and measures used in transactions are standardized by the state, and it is the state that ensures a supply of money and credit that makes market transactions possible. In short, the idea that markets might operate free of government is a fantasy rooted in a false history of economic modernity — one that has been told since the start of the 19th century.

Correcting these misunderstandings will require a whole set of new historical studies to recount the state’s central role in economic modernity. But even a cursory look at three arenas in which the state has played, and continues to play, an absolutely critical role offers an instructive lesson. These are the provision of money and credit, the building of infrastructure, and the fostering of innovative new products and processes.

2.

Obscuring the state’s central role in providing the money and credit required for a market economy has required the propagation of a number of dubious theories. The commodity theory of money, for one, claims that money has value because of its links to commodities that are universally seen as valuable, such as gold and silver. In this account, economic activity began with barter, as individuals traded one commodity for another. After a while, these individuals came to agree that one master commodity could be used to purchase all others. As governments gained control over territories, they minted coins made of one or another of these precious metals. When paper currency was first issued, people accepted it as valuable because it could be traded for gold or silver. Eventually, they continued to accept government-issued currencies even when this convertibility ceased.

The reality, however, is that money as a means of exchange has always depended on the backing of governments. It is the governmental seal of approval that gives money its value, and it is ongoing government action that stabilizes the purchasing power of a currency and averts the dangers of deflation and inflation. In short, the basic fuel of a market economy — the money that allows people to carry out transactions — is a government creation.

To be sure, the government that backs a currency in circulation might not exercise sovereignty over that particular territory. Today, for example, the US dollar is the main currency in countries such as Ecuador, El Salvador, and Zimbabwe. Moreover, many governments have waged a long struggle to establish their particular currency as sovereign over their entire territory. But it is also the case that the most successful market economies are located in those places where governments ultimately succeeded in establishing monetary sovereignty.

A similar erroneous explanation exists for “private” credit creation within market economies. Because having an adequate supply of currency in circulation does not solve the problem of financing new investment, modern economies require institutions that can provide credit to investors. The conventional claim is that banks are able to provide this credit because they are financial intermediaries that connect savers with borrowers. Since bankers play this critical role of mediating between these two sets of actors, it follows that the government should do as little as possible to interfere with this vital service to society.

But the reality is that banks and other financial institutions are not intermediaries; they create credit essentially out of thin air. When a bank issues a loan of a million dollars to a business customer, it simply adds that amount as a deposit to the customer’s account. This arrangement ultimately works because the government allocates the right to create credit to a limited number of institutions that agree to obey certain rules. Ultimately, the government underwrites this process of bank lending through the central bank’s role as a lender of last resort. As we saw in the global financial crisis, even when financial institutions extend credit recklessly, the government still steps in to back their creation of credit.
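To make those ledger mechanics concrete, here is a minimal sketch in Python of the balance-sheet operation described above (the Bank class and the customer name are hypothetical, invented purely for illustration): booking a loan records a new asset and a matching deposit liability in a single step, without debiting any saver’s account.

```python
# A simplified model of bank credit creation: issuing a loan expands
# both sides of the bank's balance sheet at once. No pre-existing
# saver's funds are transferred; the deposit is created with the loan.
# (Hypothetical sketch; real banks operate under capital and reserve
# rules, with the central bank standing behind them as lender of last resort.)
from dataclasses import dataclass, field

@dataclass
class Bank:
    loans: dict = field(default_factory=dict)     # assets: what borrowers owe the bank
    deposits: dict = field(default_factory=dict)  # liabilities: what the bank owes depositors

    def issue_loan(self, customer: str, amount: float) -> None:
        # Credit creation "out of thin air": the loan asset and the
        # matching deposit liability are booked simultaneously.
        self.loans[customer] = self.loans.get(customer, 0.0) + amount
        self.deposits[customer] = self.deposits.get(customer, 0.0) + amount

    def totals(self) -> tuple[float, float]:
        return sum(self.loans.values()), sum(self.deposits.values())

bank = Bank()
bank.issue_loan("Acme Manufacturing", 1_000_000)
assets, liabilities = bank.totals()
print(f"loans: ${assets:,.0f}, deposits: ${liabilities:,.0f}")
# Both sides grew by $1,000,000: new purchasing power exists that did not before.
```

The balance sheet stays balanced throughout, which is exactly why the practical constraint on lending is not a stock of prior savings but the rules the government attaches to its franchisees.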

The government, in short, is the ultimate source of the entire supply of money and credit that fuels a market economy. Because it outsources the process of credit creation to banks and other financial institutions, the government is, in effect, a franchiser, and those institutions franchisees. Nonetheless, the system could not work without the government’s ongoing support and restraint of its franchisees.

Thus, this central aspect of a market economy, usually seen as private and operating only in response to market signals, is in fact heavily dependent on the government. That dependence undermines the central claim of market fundamentalists that the economy exists in a realm separate from government. The perceived autonomy is an utter illusion: without the public sector managing the supply of money and credit, there is no market economy.

3.

Investing or failing to invest in infrastructure has become an important topic of current political debate, but the concept of infrastructure is a relatively recent invention. Barely any mention of infrastructure surfaces in a search of Google Books prior to 1960, and the term’s usage takes off only after 1980. This historical silence is revealing: advances in sanitation, communication, and transportation have long been central to economic modernity. Yet in the economic history literature, investment almost always refers to private investments in factories, warehouses, stores, and office complexes. Investments in an elaborate urban system of underground pipes and conduits to manage sewage, water, gas, telecommunications, electricity, and sometimes steam, meanwhile, are usually ignored.

Urbanization is one of the core features of economic modernity. Large populations make possible the rise of specialized trades and specialized industries. Industrial districts such as New York City’s garment district and Silicon Valley are indispensable for dynamism and innovation because they attract multiple firms and a uniquely skilled group of employees. Even today, with sophisticated transportation and communications, most cities around the world are actively working to create new high-technology industrial districts as a key strategy for 21st-century competitiveness.

But the movement of populations to ever-larger cities could not have occurred without the advances in sanitation that took place in the 19th and early 20th centuries. Without clean water, sewer systems, and waste treatment, urban growth would have been cut short by periodic epidemics of infectious diseases. Nor was it entrepreneurs or businesspeople who developed and financed these systems, but rather government officials. The same point could be made about building codes, fire departments, and fire-fighting technologies.

Similarly, from the 18th century onward, advances in transportation have required investments in infrastructure that included building lighthouses, dredging harbors, digging canals, developing railroad lines, paving roads, engineering bridges and tunnels, creating mass transit systems, and eventually, building and expanding airports. In the first half of the 19th century, many canal and railroad projects were initially organized by private entrepreneurs, but in a surprisingly high percentage of cases, they received some financial backing from either state governments or the federal government. The critical link in the national rail system was the building of the transcontinental rail lines, which were financed through federal land grants to railroad developers. In virtually every other case, transportation infrastructure has been provided directly by government authorities, sometimes doing the work themselves and sometimes hiring private contractors.

A similar story can be told about communications technologies. Samuel Morse persuaded Congress to finance the first experimental telegraph line, from Washington to Baltimore, which transmitted the historic message “What hath God wrought?” With the Pacific Telegraph Act of 1860, Congress provided an annual subsidy for ten years to a firm that would build the first transcontinental telegraph line linking California to the East Coast. In the case of the telephone, the government incentivized AT&T to build out a national network by providing it with a monopoly on long-distance phone service. It was also the government that helped put satellites into orbit to relay long-distance calls around the world. More recently, it was the Advanced Research Projects Agency at the US Department of Defense that created the initial versions of the Internet, including many of the protocols still in use today.

Two of the crowning achievements of economic modernity have been large, well-functioning cities and a world connected by ever-faster modes of travel and communication. These are achievements in which the government undeniably played a critical role. Private entrepreneurs simply could not have afforded to make the investments in infrastructure required to facilitate these advances.

4.

Much has been written, in these pages and elsewhere, about the importance of the government’s role in facilitating key technological innovations in the postwar era. From the rise of fracking as a technology for greatly expanding the production of both petroleum and natural gas to the development of the microchip, government research has played a key part. Indeed, my research and that of many others have amassed overwhelming evidence that the US government played a central role in virtually all of the key technological innovations since World War II.

But the government’s centrality to innovation long predates the current period; it even predates the emergence of market societies. Throughout human history, warfare has been the great driver of technological innovation. In search of military advantage, rulers have gathered together groups of people with technical skills and invested the resources required to turn ideas into devices that could lend their armed forces an edge. In many cases, such ideas were already in circulation, but it was only the urgency or threat of war that justified the substantial investments required to transform them into something usable. And in some cases, such as advances in transportation and communications technologies, once the breakthrough occurred, it diffused throughout the civilian economy.

One example from US history illustrates how conventional accounts often distort the historical record. Many of us were taught in school that Eli Whitney, the inventor of the cotton gin, also pioneered manufacturing with interchangeable parts. That was a critical technological turning point because the standardization of machine-made parts facilitated mass production. However, although Whitney did receive a contract from the government to mass-produce rifles made with interchangeable parts, he himself was unable to deliver the goods. The successful breakthrough actually occurred at the Springfield Armory, a government-run facility. Quickly copied by the machine tool industry, this government-sponsored innovation then laid the foundation for the mass production of bicycles and eventually automobiles.

The same pattern continued into the 20th century. World War I drove significant advances in airplane and radio technology, and World War II gave us atomic power, radar, the first electronic computers, and antibiotics. The latter case is particularly relevant. Alexander Fleming did identify the bacteria-fighting capacity of a particular mold, the source of penicillin, in 1928. But no firm was willing to make the substantial investment required to transform that insight into an actual medication. It was only after Pearl Harbor that the US government funded an accelerated effort to mass-produce the drug.

One important change of recent decades is that the government now makes investments in bringing new technologies to market even when no promise of military advantage exists. The same historical model of the government providing substantial funding to teams of talented scientists and engineers has brought us energy-efficient light bulbs, cheap solar panels, and more-effective wind turbines.

The key inventions of the past two hundred years cannot be attributed solely to ingenious tinkering by inspired engineers or to concerted initiatives by corporate laboratories. Yes, the Wright brothers built the first successful airplane, but the rise of the airplane industry depended on government research and development and government purchases for World War I. Similarly, Shockley’s team at Bell Labs invented the transistor, but it was government demand that created the semiconductor industry and made possible the era of computerization we are living in today.

5.

At the core of these false histories of economic modernity is a view of the market economy as an extension of nature. In this view, firms compete with each other, just as species do, and sometimes come up with adaptations that are particularly productive and reshape the direction of economic development. In this competitive process, some firms will die, but that is what Schumpeter characterized as “creative destruction” — it sets the stage for new breakthroughs.

This logic of naturalization has also been consistently deployed to question the government’s attempts to manage or regulate the economy. If the market economy is natural, then government is by definition an arbitrary contrivance whose efforts are bound to disrupt the economy’s natural rhythms.

Marx and Engels, of course, sought to challenge this effort to depict capitalism as the natural state of humanity. But by explaining all of the dynamism of capitalism as a product of a single class — the bourgeoisie — they slid into their own version of naturalism. They too failed to recognize economic modernity as something constantly constructed and reconstructed at the intersection of states and markets.

At the end of the day, markets are completely reliant on government to operate effectively. This insight does not suggest that governments should dictate prices or impose crippling regulations on business. Government actions can be more or less effective in achieving such goals as market stability, more-efficient use of resources, and more-equitable outcomes. There is ample room for debate and disagreement over how regulations should be structured and what investments governments should make. But those discussions are effectively derailed when the debate is distorted by naturalization and the fantasy of self-regulating markets.

Nor do we have to worry that a government-led economic development strategy requires ceding power to giant Washington bureaucracies. On the contrary, it is possible to design programs that use federal resources to enhance political and economic capacities at the local level. The New Deal’s strategy for rural electrification provides a classic example in which communities were encouraged to create local cooperatives that would tap into the national electrical grid and deliver power to remote homes and farms. The government, in turn, lent these co-ops the money to finance this undertaking — one that drove massive gains in rural well-being.

The kind of economic development strategy we need today would rest on two pillars: massive investments in infrastructure and accelerated efforts to roll out innovations that could generate equitable growth. The infrastructure spending might focus on affordable housing, clean energy, energy conservation, and universal access to broadband and cellular service, as well as increasing the resilience of communities threatened by climate change. It could be financed by a variety of decentralized mechanisms that would give local communities greater influence over these decisions.

Such infrastructure investments would also help accelerate the diffusion of innovations, such as the technologies required for a smart, energy-conserving electrical grid. But it will be equally important to shift government innovation programs away from military technologies and such questionable innovations as autonomous vehicles, which threaten to further erode jobs and provide new opportunities for terrorists to steer vehicles into crowds of pedestrians. Instead, we might look to integrate ride-sharing technologies with urban transit systems in a way that creates both decent-paying jobs and new modes of getting people from one place to another that are more efficient and equitable.

The point is to recognize that the great economic gains of the past several hundred years were made possible by the entrepreneurial activity of government in investing in innovation and infrastructure and in providing the structures of money and finance required to support productive private investments. Once that is understood, it becomes possible to create a serious reform agenda that uses these governmental powers to move us toward an economy that is environmentally sustainable, more egalitarian, and not dominated by the power of entrenched corporations. Until we make these critical state capacities visible, however, we will only continue to rehearse old arguments, overlooking the drivers of long-term growth and shared prosperity that make for a better, more equitable world.
