In the Anthropocene…
The faulty search for precision at the heart of a vague concept
Remember 2006, when Pluto was demoted to a “dwarf planet” by the International Astronomical Union? Despite widespread public outcry, Pluto’s planetary status remains diminished to this day. As settled as we believe some science to be, scientific facts can be changed—even without new evidence—merely because a small committee of experts draws a line.
Now, a group of experts—the Anthropocene Working Group (AWG)—has done just that, declaring their own formal definition of the “Anthropocene” as an epoch in the Geologic Time Scale. This isn’t just a matter of academic interest; their work amounts to trying to nail spaghetti to the wall, with ramifications for the way we understand our place in the universe and the earthly decisions that follow.
What makes an epoch?
The concept of the Anthropocene has been kicked around since long before the AWG started its work. One could say it began in 1778 with Comte de Buffon, who envisioned that “the power of man” would assist “that of nature.” Much later, in 1992, New York Times journalist Andy Revkin coined the forerunner to the modern term by suggesting we were in the “Anthrocene.” Then, in 2000, Paul Crutzen, a Nobel Prize-winning chemist, generated the more familiar “Anthropocene” on the spot at a conference in an attempt to emphasize the human impact of our earthly behaviors.
Once coined, “the Anthropocene” became simultaneously more vague and more precise.
On one hand, the term was enthusiastically adopted by the humanities, social sciences, and the general population. Academic papers, books, conferences, museum exhibits, and entire issues of literary journals have centered on the phrase “in the Anthropocene,” as if we have a shared understanding of what that denotes. It apparently bestows significance on topics ranging from saving nature (see: “Saving nature in the Anthropocene,” by Professor of Environmental Studies James Proctor) to finding your enemies (“Telling friends from foes in the time of the Anthropocene,” by philosopher Bruno Latour) to the future (“Human destiny in the Anthropocene,” by ethicist Clive Hamilton) to cows (“Cattle in the Anthropocene,” by scholars Andrew McGregor and Donna Houston) and even to death (“Learning to die in the Anthropocene,” by war veteran Roy Scranton). As Orion magazine has noted, there are at least 40 versions of the concept.
Academia’s love affair with the Anthropocene has even spawned a wild array of alternative terms: “Capitalocene” (the problem is capitalism), “Manthropocene” (the problem is men), and the difficult-to-pronounce “Chthulucene” (defined by author Donna Haraway as the “ongoing multispecies stories and practices of becoming-with in times that remain at stake, in precarious times, in which the world is not finished and the sky has not fallen”).
Outside of all this brainstorming, on the other hand, we have the AWG—a small group of geologists and stratigraphers tasked with bringing precision to the cloud of ideas by putting a quantifiable marker on when the Anthropocene began. At its inception, the group included a mix of geologists, environmental scientists, and others with related expertise, even including Crutzen himself.
In scientific terms alone, their task was huge thanks to the complexities of collecting the type of historical geological data that would be needed to make a determination. But the job was perhaps even more daunting in moral terms. What assumptions are embedded in this word? If it’s both a concrete moment in history, decided upon by a small group of people, and a term that the humanities can reinvent over and over again … what are we talking about? And what are the implications?
Many in academia, for example, use the Anthropocene as a way of encapsulating assumptions about the human-environment relationship or to rehash old debates about that relationship, especially over our ideas about what nature is and what it is not. While some argue that the concept of the Anthropocene represents the travesty of humans fundamentally altering a pure nature, others have argued that there was never a pure nature to alter. In other disciplines, the term is a way of celebrating human and technological progress.
Either way, “the Anthropocene” is deeply resonant. It’s a provocative, generative idea, focusing the mind on the extent of our influence and agency—are we pilots or passengers on this planet?—including the ways technology and culture mediate our relationship with nature, the kind of world we want for future generations, and what it means to be human.
ChatGPT HAS ENTERED THE CONVERSATION...
In common environmentalist discussion, “Anthropocene” refers to a proposed geological epoch that marks the impact that human activities have had on the earth’s environment and ecosystems. The term was first coined in the late 20th century to describe the profound changes that human activities, such as burning fossil fuels, deforestation, and the use of chemicals, have had on the planet.
According to the Anthropocene hypothesis, the impact of human activities on the earth’s environment and ecosystems has been so significant that it has caused a shift in the planet’s natural systems and a new epoch should be recognized in the geological time scale. Environmentalists view the Anthropocene as a wake-up call for humanity, highlighting the need to address the environmental challenges that we face, such as climate change, loss of biodiversity, and pollution.
In environmentalist discussions, the Anthropocene is often used to emphasize the need for immediate action to reduce greenhouse gas emissions, curb the impacts of climate change, and protect the natural environment. Environmentalists believe that the impacts of the Anthropocene, such as rising temperatures, increasing frequency and intensity of extreme weather events, melting of glaciers, and sea level rise, pose significant risks to human health, food security, and the stability of societies around the world.
Welcome to the new “Crawfordian Anthropocene”
The effort to formally define “Anthropocene” started in 2009, when the AWG was formed under the leadership of Jan Zalasiewicz by the keepers of the geological time scale (GTS) (more specifically, under the tortuously named Subcommission on Quaternary Stratigraphy of the International Commission on Stratigraphy) with the express mission of defining precisely when the epoch we are now in began—and, to a lesser extent, where it began. (An epoch is always narrowly defined in terms of its temporality, with its spatiality mostly there to validate the general theory.)
From the AWG’s beginning, some members of the body didn’t even believe that calling the Anthropocene an epoch was appropriate. Rather, they said, it should be described as an “event” because the human impacts on the environment are simply too variegated, in terms of time, space, and impact. Calling it an “event,” they argued, would better encompass the sheer diversity and complexity of the human-environment relationship. Even so, the idea of an epoch won out, and in 2019, after a decade of research, 29 of the group’s 34 members voted that the mid-20th century would be marked as the epoch’s start to recognize rapid global changes around that time that are otherwise known as the “Great Acceleration.”
In reality, the changes of that era represent a complex and relatively gradual evolution occurring at different rates in different places across the planet. It is decidedly not the kind of instantaneous planetary change—like the climate disruption caused by asteroid strikes—that stratigraphers generally require to mark boundaries between epochs and other major units of the GTS.
Understanding that, the AWG opted instead to mark the Anthropocene boundary at a single selected site—the “golden spike,” that is, the physical location that best aggregates the ways in which humans became a driving force—where a continuous core of sediment or other deposited geological material would show a clear stratigraphic marker for the 1950s, such as plutonium and carbon-14. Voting was restricted to stratigraphers, and markers for any time other than the mid-20th century were off-limits.
In mid-December 2022, after investigating 12 potential sites, the stratigraphers completed their first round of voting to decide on their site—a decision that will mark the specific start of the Anthropocene. Although the voting is not yet final, it increasingly looks like they will define the Anthropocene based on a single core of sediment from Canada’s Crawford Lake, whose composition indicates that the Anthropocene epoch began in 1950.
The stratigraphic definition of the Anthropocene is presented as quite precise and scientific, but scientific facts are decided by insular groups of humans, in this case funded by a cultural institution in Berlin. A different group could have come up with a totally different definition; one more interested in economics or in cultural change, for example, might well have picked a different variable to mark changes from one age to another.
That’s understandable. But the problem is that this terminology—decided upon by a very small circle of individuals—has real world implications, and perhaps most importantly, doesn’t recognize the way in which the concept of the Anthropocene has already taken its own course, thereby in some respects leaving the stratigraphers on the sidelines. The new “Crawfordian Anthropocene,” despite its apparent solidity, is unlikely to get us any further than the riffs it’s meant to replace.
The power of words
Stratigraphers do not and should not own the Anthropocene. Human transformation of this planet is not simply a matter of geology. “The Anthropocene” is most helpful as a point of departure for contemplating the human-environment relationship, and a mirror through which to examine our normative assumptions.
And yet, more than many of these other actors, what the stratigraphers say will soon officially go. An epoch defined by dirt in Canada will likely become the standard scientific definition of the Anthropocene for at least the next decade, as time units can’t be changed for 10 years after being decided. In practice, the idea that humans have fundamentally remade the planet is now baked in—with all the assumptions that entails.
If, in fact, the Anthropocene is narrowed to begin in 1950, what happened before that becomes out of bounds. On the ground are the stratigraphers, geologists, environmental scientists, anthropologists, and others, for whom this re-definition will change the parameters of their research. And those who draw on the science these disciplines produce, including organizations involved in assessing the challenges of the human age, will find their foundations shifting along with the data available to make certain claims.
The Breakthrough Institute, and by relation ecomodernism, has embraced the concept of the Anthropocene as an opportunity—a sign of the ways in which human agency, when corralled, can shape a better world. That means understanding and guiding how our societies are transforming the planet—carbon emissions that contribute to climate change, the material throughput of our economic consumption, land use change, and biodiversity loss from agriculture and other human endeavors—and navigating the trade-offs and solutions to those impacts.
As a fundamentally cultural, social, and political project, ecomodernism is not and should not be beholden to the redefinition of core ideas by any scientific subgroup. And that’s why we all should push back on the prospect that a small group of people—scientists or otherwise—could move to redefine and therefore constrain the strategies and decisions of those who must make the “real world” decisions moving forward. The Anthropocene can continue to serve as a common ground for science and society. Or it can be the technical workspace of stratigraphers, with their own narrow interests dominating the broader interests of society in a way that matters to all of us on this planet.
“The Anthropocene” is most powerful when it’s used as a point of reflection and expansion, rather than reduction and precision. Non-stratigraphers use the term because it provides a shortcut to pre-existing beliefs about the human-environment relationship, while pretending we have universal normative goals. This is precisely why defining it as an “event” rather than an “epoch” could corral the concept while not restraining it. It is by contesting the term, but not infinitely, that it becomes powerful.
We might miss planet Pluto, but this is Earth. “The Anthropocene” has been a meeting place where we make sense of our relationship to it. Giving up ownership of this resonant term to a place, date, and particular group of people robs it of its broader social and cultural importance. When made accessible, to paraphrase Mark Twain, “the Anthropocene” can be a place to dream other dreams, and better.
* April 24: This article has been updated to correct the date Andy Revkin coined the "Anthrocene."