Ecomodernism and the Anthropocene

Humanity As a Force for Good


No sooner had two earth scientists declared that humans had so significantly altered the planet that a new geologic epoch was needed than a firestorm of debate ensued, both within and outside the scientific community. If the Anthropocene, or age of humans, signified a new era, what ought to be our posture toward the future? While some have described the Anthropocene as a world of hurt, others, including the concept’s main popularizers, have imagined the possibility of a good Anthropocene. Writing on efforts to translate these positive visions into reality, the authors in the fifth issue of Breakthrough Journal construct a future of accelerated decoupling, a view of nature as local and constantly changing, an embrace of complexity and pragmatism, and a role for the collective and the individual.

Summer 2015 | Ted Nordhaus, Michael Shellenberger, and Jenna Mukuno

Sometime next year, the International Commission on Stratigraphy (ICS) may or may not decide that humans have changed the Earth so significantly that we have entered a new geologic epoch, the Anthropocene, or age of humans.1 The idea that humans have created a qualitatively different planet from the one we inherited was discussed at the beginning of the 20th century, but the informal use of the term dates back to the 1980s and ‘90s.2 In 2000, atmospheric chemist Paul Crutzen and biologist Eugene Stoermer formally proposed renaming the current geologic epoch, arguing that the Anthropocene began with the Industrial Revolution, when the increased use of fossil fuels began the process of anthropogenic global warming3 –– a view that was echoed by other prominent earth scientists4 and promoted by environmental journalists.5

But no sooner had they done so than other scientists pointed out that humans had significantly altered landscapes and the climate for as long as 5,000 years. “Does it really make sense to define the start of a human-dominated era millennia after most forests in arable regions had been cut for agriculture, most rice paddies had been irrigated, and carbon dioxide and methane concentrations had been rising because of agricultural and industrial emissions?” a group of scientists wrote in Science.6

Recently, the number of proposed start dates for the Anthropocene has proliferated. Some scientists endorse the first nuclear test in 1945.7 Others point to 1610, a date associated with the European colonization of the Americas.8 (The Columbian Exchange mixed species and diseases globally for the first time, and the deaths of tens of millions of indigenous people from European diseases resulted in forest regrowth and a dip in net carbon emissions.) Still other proposed start dates are tied to plastics, subway systems, and soils that humans have been altering for thousands of years.

The disagreements run deep enough that no lasting consensus appears likely to come easily or quickly, in part because there is no clear line. To be sure, anthropogenic climate change and biodiversity losses have increased dramatically in the last several centuries. But those developments represent an acceleration of trends going back hundreds and even thousands of years, not the starting point of a new epoch. “Under these circumstances,” concluded another duo, “it is questionable whether efforts to establish a single date for the start of the Anthropocene can have any meaning or value.”9 It is notable that the scientific debate over naming the Quaternary, the period that includes the Holocene, lasted for 60 years.


While some have described the Anthropocene in wholly negative terms, many others, including the concept’s main popularizers, have imagined the possibility of a good Anthropocene. Crutzen argued that the Anthropocene could be a time of “vastly improved technology and management, wise use of Earth's resources, control of human and domestic animal population, and overall careful manipulation and restoration of the natural environment.”10 Other members of the ICS’s 37-member Anthropocene working group share Crutzen’s hope for technology and human development. They speak of “Anthropocene bright spots,” “planetary opportunities,” and “a good Anthropocene.”11

Whether or not a new geologic epoch is formally declared, or a start date set, what matters most are efforts to translate these positive visions into reality. It is for this reason that the fifth issue of Breakthrough Journal is more focused on what we mean by “good” than what we mean by “Anthropocene.” Answers to that question are offered in “An Ecomodernist Manifesto,” published last April by a group of 18 environmental scientists, scholars, and campaigners, including the two of us, and reprinted here. This essay, which generated headlines in newspapers around the world and a growing number of responses, signified a turning point. While Breakthrough Journal will always be home for intellectual criticism, with this issue our emphasis shifts from what environmentalism is not and cannot be, to what ecomodernism is and should become.

The ecomodernist manifesto affirms the traditional environmentalist view that human societies should shrink their impact to leave more of the Earth for nature, but rejects the idea that humans should attempt to harmonize with natural processes. Instead, the manifesto argues, the goal should be to increasingly "decouple" human development from natural resource use and environmental impacts. The key to decoupling is intensifying many human activities –– particularly farming, energy extraction, forestry, and settlement –– so that they use less land and interfere less with the natural world.

The concept of decoupling owes much to environmental scientist Jesse Ausubel, whose essay in this issue, “The Return of Nature,” offers an empirical look at how decoupling is happening in the United States. “Our economy no longer advances in tandem with exploitation of land, forests, water, and minerals,” Ausubel writes. “American use of almost everything except information seems to be peaking. This is not because the resources are exhausted, but because consumers have changed consumption, and because producers changed production.”

This is not just a vision for the United States. Ausubel argues that decoupling can be global –– if we take conscious actions to decouple. “If we keep lifting average yields … stop feeding corn [as ethanol] to cars, restrain our diets lightly, and reduce waste, then an area the size of India or of the United States east of the Mississippi could be released globally from agriculture over the next 50 years or so.”

Technological decoupling of consumption from environmental impacts is aided and abetted by slowing growth and the changing nature of consumption in many parts of the world, Fred Block argues in “The Stagnation Illusion.” In the wake of the Global Financial Crisis of 2008, many economists have become concerned about the prospect of “secular stagnation,” the notion that growth rates in advanced developed economies may be permanently reduced. But Block argues that this phenomenon is a feature of wealthy economies, not a bug.

Low growth rates are in large measure a function of a larger economy. “A two percent growth rate for a $10 trillion economy produces the same annual increment in output as a four percent growth rate for a $5 trillion economy,” Block notes. At the same time, after material needs are largely met, people take their wealth in less material ways that are not so easily measured, such as increased leisure time and higher environmental quality. Achieving continued progress will require new economic thinking focused not just on “abstract units of production and consumption” but also on “companionship, good health, intellectual stimulation, aesthetic experience, and natural beauty.”
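Block's point is simple arithmetic: the absolute annual increment in output is the growth rate times the size of the economy, so a slower rate on a larger base adds just as much. A minimal check, using the illustrative figures from his example:

```python
def annual_increment_billions(gdp_billions: int, growth_pct: int) -> int:
    """Absolute yearly output gain = growth rate x economy size."""
    return gdp_billions * growth_pct // 100

big = annual_increment_billions(10_000, 2)   # $10 trillion economy growing 2%
small = annual_increment_billions(5_000, 4)  # $5 trillion economy growing 4%
assert big == small == 200                   # both add $200 billion per year
```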


If ecomodernism is to have a science and an economics it must also have a spirituality, argues environmental philosopher Mark Sagoff in “A Theology for Ecomodernism.” Sagoff argues that ecomodernists should reject the monotheistic view of nature that environmentalists borrow from Judeo-Christianity. Pre-Christian religions viewed nature as local and assigned gods and spirits to trees, the wind, the harvest, and the sun, Sagoff notes. But “when God became One, nature became one …. That nature expresses one God and is therefore unitary, singular, and at odds with man, who consumes and corrupts it, became a familiar trope of conservationism.” Sagoff concludes, “The theological hope of ecomodernism is that we can understand nature to be many, many places, each with its own guardian spirit. The hope is that human beings will become the guardian spirits of the natural world.”

If humans are to succeed at becoming effective guardian spirits, then their efforts at “rewilding” landscapes liberated from the human economy must become more pragmatic. In “Rewilding Pragmatism” geographer Martin Lewis draws on the messy successes of Kruger National Park in South Africa. Kruger is crisscrossed by roads and cluttered with middlebrow accommodations. But its populism, its growing size, and its rebounding populations of elephants and apex predators have led scientists to conclude that Kruger’s pragmatic model –– characterized by positive relations with park neighbors –– is superior to fussier rewilding efforts elsewhere. “We don’t have to choose between the ‘wilderness’ of the traditional green imagination and the ‘domesticated garden’ that is the supposed desideratum of the new school,” Lewis concludes.

Such pragmatic rewilding might be aided by understanding that species and landscapes were in a state of constant change long before humans showed up. “As a young ecologist, I was trained to see introduced species as a kind of pollution no different from dioxin toxicity or aluminum poisoning,” Joseph Mascaro writes in his essay “Earth Makers.” Out in the field, however, Mascaro discovered that ecosystems are always-already “novel ecosystems” with newly arrived “invasives” mixed up with long-established “natives.” As a consequence, the past can inform but not determine the future. Forward-facing ecologists and ecomodernists, Mascaro argues, “should encourage resilient ecosystems that maintain high diversity and robust functioning through a range of environmental conditions –– regardless of the historical ranges of the species involved.”


While fear and negativity still pervade many discussions of the future, positive events over the past 20 years have resurrected a hopeful vision of progress. The horrors of World War II and the nuclear arms race led to the rhetoric of doom that animated not only the environmental movement, but also the broader political Left starting in the 1960s. Resource scarcity and pollution were no longer problems to be solved through technology but rather signs of end times and cause for going back to a time when humans lived in greater harmony with nature. But since the 1980s, as fertility levels have fallen, global food production has outstripped rising population, extreme poverty has been cut in half, and deaths from war have declined, it has been increasingly hard to make the claim that progress is an illusion.

Still, the preference for apocalyptic framing of what are actually chronic features of modern life persists on the Left. In “Fear and Time,” Will Boisvert argues against antiquated and overly simplistic models such as the Doomsday Clock created by the Bulletin of the Atomic Scientists. The problem isn’t merely that the clock is outmoded in its claim that we are living in more dangerous times than during the Cuban Missile Crisis, a period most experts view as the closest humans ever came to nuclear war. The problem is that focusing on low probability catastrophic risks draws our attention away from addressing the more prosaic risks that continue to plague modern societies.

Catastrophic climate scenarios fail to recognize the ways in which climate change is a consequence of developments that have made humans more resilient to the climate, while often downplaying or even dismissing the slow, laborious work of decarbonizing economies. Summoning exaggerated fears of nuclear accidents makes it harder to address more prosaic risks like climate change and air pollution. Fearmongering about the risks of genetically modified flu viruses getting out of the laboratory threatens critical research to better understand how dangerous viruses are likely to evolve through natural selection and how best to counter them.

“While all technology has destructive potential,” Boisvert writes, “the net effect of technological development has been to make life vastly safer and healthier. Using exaggerated doomsday speculations as an excuse to impede it can therefore be both perverse and counterproductive.”

Sustaining progress to achieve better futures will require that liberalism once again embrace collaboration and compromise. Shared investment in social welfare, technology, and infrastructure depends upon a social contract between the public, government, and industry. In “Raiding Progress,” historian Michael Lind argues that it was the public interest movements of the 1960s, not corporate mendacity, that shattered the postwar liberal consensus that made that era’s high growth, low inequality, and improving environmental quality possible.

Prior to the 1960s, American liberalism accepted that “public policy emerged from negotiations among economic interest groups, rather than from all-wise, altruistic progressive experts.” Labor unions, corporations, and consumer groups sought regulations aimed at helping industries flourish.

All of that changed with the rise of the environmental and other “public interest” movements, which attacked corporate interests as illegitimate and compromise as appeasement. The Left’s loss of working-class voters and productive industries was the result. “Whether the center-Left can aspire to repair its relations with productive industry and much of working-class and middle-class America depends on whether it can renounce the dream of the crusade and rediscover the lost art of the deal,” argues Lind.

A positive vision of the future, accelerated decoupling, a view of nature as local and constantly changing, an embrace of complexity and pragmatism, and a role for the collective and the individual –– these are the ingredients of a positive future, whether or not we call it the Anthropocene. Humans became world makers through a long process of sociocultural evolution.12 We have remade the world many times in the past and will likely do so many times in the future. The important question now is not when the Anthropocene began, but rather how it will proceed.


Photo Credit: Shutterstock


1. “Subcommission on Quaternary Stratigraphy,” last modified May 5, 2015, accessed June 5, 2015,

2. James Syvitski, “Anthropocene: an Epoch of Our Making,” Global Change 8, no. 78 (2012), 12-15.

3. Paul Crutzen and Eugene Stoermer, “The ‘Anthropocene,’” Global Change Newsletter 41 (2000), 17-18.

4. Will Steffen, et al., “The Anthropocene: Conceptual and Historical Perspectives,” Philosophical Transactions of the Royal Society 369, 842-867.

5. Andrew Revkin, “Confronting the Anthropocene,” Dot Earth (blog), New York Times, May 11, 2011, 

6. William F. Ruddiman, et al., “Defining the Epoch We Live In,” Science 348, no. 6230 (2015), 38-39.

7. Jan Zalasiewicz, et al., “When Did the Anthropocene Begin? A Mid-Twentieth Century Boundary Level is Stratigraphically Optimal,” Quaternary International (in press). 

8. Simon L. Lewis and Mark A. Maslin, “Defining the Anthropocene,” Nature 519 (2015), 171-180. 

9. S. J. Gale and P. G. Hoare, “The Stratigraphic Status of the Anthropocene,” The Holocene 22 (2012), 1491-1494.    

10. Paul J. Crutzen, “The ‘anthropocene,’” Journal de Physique IV 12, no. 10 (2002), 1-5.

11. Ruth DeFries, et al., “Planetary Opportunities: A Social Contract for Global Change Science to Contribute to a Sustainable Future,” BioScience 62, no. 6 (2012), 603-606; Andrew Revkin, “Paths to a Good Anthropocene,” Dot Earth (blog), New York Times, June 16, 2014,

12. Erle Ellis, “Ecology in an Anthropogenic Biosphere,” Ecological Monographs 85, no. 3 (in press),
