“Through the machine, man in Socialist society will command nature in its entirety, with its grouse and its sturgeons. He will point out places for mountains and for passes. He will change the course of the rivers, and he will lay down rules for the oceans. The idealist simpletons may say that this will be a bore, but that is why they are simpletons.”
— Leon Trotsky, Literature and Revolution (1924)
The next ice age was supposed to arrive no earlier than 50,000 years from now.
Instead, humans have accidentally, unwittingly, blithely stumbled into delaying its arrival by about the same amount of time.
During the past 2.5 million years, there have been dozens of small ice ages. These ice ages, or glaciations, are set up by periodic variations in the Earth’s precession (wobble), axial tilt, and orbital eccentricity, resulting in small changes in the amount of incoming solar radiation. Such changes kick off a series of ice, oceanic, and atmospheric feedbacks that produce glaciation. Low solar radiation at high latitudes means that some ice that otherwise would have melted sticks around, building up over time into an ice sheet. These “Milankovitch cycles,” named after Milutin Milanković, the scientist who first proposed them, were confirmed in 1976 by a trio of scientists who used marine sediments to construct long-term climate records.
Something of a mystery remained in the data, however, in that sometimes low insolation (exposure to solar radiation) was not followed by an ice age. The explanation was long presumed to lie in the concentration of greenhouse gases in the atmosphere: carbon dioxide levels had to be sufficiently low that the planet would cool enough for an ice age to begin. In 2016, researchers at the Potsdam Institute for Climate Impact Research finally confirmed this to be the case. Deploying an Earth system model with an impressive array of moving parts simulating the atmosphere, ocean, ice sheets, and global carbon cycle, they found that we had been due for glacial inception just before the Industrial Revolution.
However, even without human combustion of coal, oil, and gas, human activity had already postponed another true ice age by 50,000 years. Earlier modeling results had found that preindustrial greenhouse gas concentrations were sufficiently high to avert glaciation even with low insolation. Archaeological and geological evidence suggests that early agriculture on the Eurasian continent (in particular, forest clearance some 8,000 years ago and rice irrigation 5,000 years ago) caused an estimated average global warming of about 0.8°C, sufficient to forestall the next ice age. In view of all the modern era’s combustion of fossil fuels, the Potsdam researchers concluded that the next ice age had been pushed back by another 50 millennia.
This tight link between greenhouse gas concentrations in the atmosphere and glaciation offers a tantalizing inversion of our current predicament, as we attempt to eliminate all greenhouse gas emissions in the face of global warming this century. But in 100,000 years (at the possible advent of the next ice age), we will surely want to emit as much as possible for exactly the same reason: preservation of the temperate conditions under which humans have flourished but that are actually quite unusual in the history of the Earth.
This prospect serves to remind us that when we work to prevent dangerous climate change, arrest biodiversity loss, better supervise nitrogen fertilizer use, limit plastic pollution, work to understand endocrine disruptor chemicals, or avoid stratospheric ozone depletion (or manage any of the other ecological challenges we have faced and overcome or are facing and have yet to overcome), we are ineluctably engaging in an anthropocentric activity. We are working to preserve — or even optimize — the set of environmental conditions that are valuable to humans.
In fact, humans have been altering planetary conditions to suit our own ends for millennia. But today, the outcome of that anthropogenic activity remains far from certain. If the past few decades are any indication, we are rapidly approaching a world changed dramatically by our own inadvertent activity. The problem we face is not that we are transforming the Earth system on the scale of geochemical cycles, but that we are doing so unintentionally. We should be doing so — as long as it is in our long-term benefit — on purpose.
There is a widespread belief, particularly on the green Left, that nature is governed by an essentially static balance into which greedy, self-regarding humans intrude. In this narrative, environmental problems are born of an anthropocentric or “speciesist” mindset. We must overcome this human supremacy, such activists argue, just as we work to overcome white supremacy and misogyny. We must replace anthropocentrism with “biocentrism,” a view in which all species hold equal moral worth. A recent New York Times opinion piece titled “Would Human Extinction Be a Tragedy?” by Clemson University philosopher (and adviser to the television show The Good Place) Todd May takes biocentrism to its limits. He concludes that, given our capacity for species extinction, perhaps it would be best for the planet if human civilization extinguished itself.
It is curious, however, that so many of those who are concerned with “saving the planet” seem unaware that on a geological timescale, the story of the planet is one of constant dynamic transformation. The current set of benign conditions that humanity is used to and upon which we depend is not the norm — there is no norm.
The Paleocene epoch, some 66–56 million years ago, experienced the warmest conditions of the Phanerozoic eon (wherein animals and plants have been abundant). During what is termed the Paleocene–Eocene Thermal Maximum (PETM) 55.5 million years ago, palm trees grew in Alaska and crocodiles swam in the Arctic. Since then, a series of tectonic events has unfolded. These include the collision of India into Asia (creating the Himalayas, thereby dramatically increasing weathering and erosion and thus the consumption of atmospheric CO2); the widening gap between South America and Antarctica (essentially isolating the Southern Ocean from the warmer waters of other oceans); and the volcanic creation of the isthmus of Panama, which separated the Pacific from the Atlantic and restricted the flow of heat from the tropics to the poles. As a result, there has been a steady decline in global temperature of about 8°C as well as a major overall decline in biodiversity in the Arctic.
From a biocentric perspective, then, must we adopt the position that the PETM was morally superior to the geological periods that followed? If so, who are we to arrest this “restoration” of PETM conditions, even if we are the cause of this thermal reversal? Progressing further along this line of biocentric logic, we must also ask whether we can even deem any period or condition superior to any other in the first place, given that the story of the Earth is one of constant dynamic flux.
Another commonplace among advocates of biocentrism suggests that humans are uniquely disruptive — that we are the first organism in Earth’s 4.5 billion–year history to reorder geochemical cycles. But that is not strictly true.
Some 2.5 billion years ago, over the course of about 200 million years, cyanobacteria were responsible for the Great Oxygenation Event (GOE), which has been described as the most catastrophic moment in the history of cellular life. Cyanobacteria (sometimes called blue-green algae) are a phylum of bacteria that were the first organisms to photosynthesize, that is, to split water molecules using energy from sunlight and produce free oxygen as waste. This tremendous energy-boosting adaptation gave them a huge metabolic leg up, resulting in the near extinction of almost all other, mostly anaerobic, life on Earth. Where obligate anaerobes (those harmed by the presence of oxygen) did not retreat to oxygen-poor niches, they were wiped out. If this weren’t enough, all the cyanobacterial oxygen “pollution” reacted with methane, a greenhouse gas some 25 times stronger than CO2. This interaction resulted in a sharp drop in atmospheric methane, which in turn sent temperatures plummeting, producing the first known, most severe, and longest ice age — a snowball Earth that lasted 300–400 million years.
Smaller taxa of microbes have been implicated in other mass extinctions. But it’s not just bacteria that are ecocidal super-killers. The Late Devonian mass extinctions 376–360 million years ago may have resulted in part from the conquest of land by plants with vascular systems that allowed root networks to penetrate and break up rock, creating the first soil. This process was assisted by chemical weathering: the production of organic acids by mycorrhizae (symbiotic root fungi) and bacterial decomposition. Organic matter (nutrients) then washed into streams and out into the ocean, causing excessive growth of algae and subsequent depletion of oxygen in the water, in turn suffocating fish — very much akin to contemporary hypoxic algal bloom “dead zones” caused by agricultural runoff, only lasting for thousands of years. The development of seeds, allowing plants to establish themselves further inland, only exacerbated the situation. Completing the ecocidal hat trick, the first trees and forests may have drawn down sufficient CO2 to cause continental glaciation and thus dramatic drops in sea levels that further stressed coastal life.
The full story of the Devonian extinctions, which wiped out an estimated 75 percent of species, is far from settled. Other complementary and competing causes have been suggested, including volcanism, asteroid impacts, and the merger of landmasses. But if vascular plants and their fungal friends were indeed in the driver’s seat of this mass extinction, then, as with the GOE, we have to concede that evolution contains within itself the capacity to extinguish life on vast scales.
Put another way: mass extinctions are natural. But it is only by adopting the naturalistic fallacy — the belief that what is, is what ought to be — that anyone comes to the conclusion that natural equals good.
Was the GOE or the Late Devonian mass extinction event — or any other extinction event, for that matter — good or bad? Is the assemblage of life before and after the GOE, or the Late Devonian extinctions, better or worse? Ironically, if we adopt the biocentric point of view that many environmentalists recommend, it is neither. Previous extinction events just become a part of the story of life. Whether biotic or abiotic in origin, such events appear to be as much a part of evolution as death is a part of, and requirement for, life. Beyond human valuation, extinction comes to have no moral sense at all.
From the much maligned anthropocentric point of view, on the other hand, we can make such judgments. For us, these past extinctions were very good indeed, since neither we nor the assemblage of species we depend on would be here otherwise. Yet, once again from the anthropocentric point of view, a sixth mass extinction event would be very, very bad indeed. But bad for us alone, for we are unlikely to survive it. And even if we did survive — or even flourish despite it — it would still be bad, if only for the loss of the aesthetic pleasure we derive from this assemblage.
Similarly, with respect to climate change, an anthropocentric view allows us to assess the desirability of possible outcomes. As geological scientists have made clear, there is a set of climatic and biotic conditions — “Goldilocks conditions” — that, although rare in the history of the planet, over the past 10,000–20,000 years or so have allowed humans to flourish. It appears that going beyond 2°C of warming above preindustrial times threatens to unleash some pretty calamitous conditions that would limit human flourishing. And that is why we need to take climate change seriously: because of us.
The novelty of the current biocrisis, compared with those of the deep past, is that no other organisms could make a choice about whether to continue or stall their disruptive development. During the GOE, cyanobacteria couldn’t hold a global political discourse on whether to keep photosynthesizing. And during the Late Devonian extinction, vascular plants couldn’t write newspaper opinion pieces for or against creating soil.
But we humans can make the choice to switch our energy use away from fossil fuels. We already made the choice to ban chlorofluorocarbons (CFCs) to successfully preserve the ozone layer. Rather than flagellating ourselves for how uniquely ecologically disruptive we are, we should instead be coming to terms with our own agency. Maintaining the world we know and love — and have in many ways built — will require more initiative, not less.
The notion of optimizing the Earth for human flourishing may at first glance seem rebarbative. But even the language of the most environmentally conscious — those very well-meaning, committed activists who are most deliberately trying to achieve a biocentric mindset and leave behind anthropocentrism — is often unconsciously anthropocentric in this regard. They sound this way most clearly when they, quite rightly, argue that urgent climate action is required to prevent an increase in extreme weather events such as hurricanes and coastal storms, and to prevent sea-level rise.
Humans have existed under and adapted to a wide range of differing sea levels. What we now call Britain and the Netherlands were once connected via Doggerland, an above-sea-level region that was rich in human habitation in the Mesolithic period. Then, some eight-and-a-half millennia ago, sea levels gradually rose, forcing humans to migrate away. Gradually is the key word here. Unlike past sea-level rises, anthropogenic climate change promises abrupt sea-level rise, which poses grave challenges to the series of coastal megacities that make up much of the modern world.
In the case of imminent sea-level rise, most of these cities would be inundated if they did not make vast expenditures on dikes and land reclamation, diverting those funds from other public goods (such as healthcare, childcare, pensions, a shortened work week, expansion of the arts or scientific research, or a more robust space exploration program). In underdeveloped nations, sea-level rise becomes yet another costly barrier to advancement, or, as is already happening in some Pacific nations, threatens to extinguish them completely.
Today, the global response to such threats engendered by climate change has been consistent, if underwhelming: competing nation-states occasionally kludge together lowest-common-denominator treaty-based efforts, such as the Kyoto Protocol and the Paris Agreement. They are enforced only via weakly accountable structures of global governance, such as the UN Framework Convention on Climate Change (UNFCCC) Conference of the Parties process.
Unfortunately, we have little to nothing to show for these efforts. In 2018, Spencer Dale, the chief economist for the BP Statistical Review of World Energy — the gold standard for energy and carbon emissions data — issued an unusually frank and personal statement accompanying his latest report. It showed that, despite all the efforts of climate diplomacy, coal’s share of the global energy mix in 2017 was identical to what it was in 1998 — the year the Kyoto Protocol was signed. Even more discouraging, the share of non-fossil-derived energy had actually declined. As he lamented, “We have stood still: perfectly still for the past 20 years.”
Treaties are by nature consensus-driven, not majoritarian (i.e., democratic), in their politics. Consensus politics produces lowest-common-denominator results, very slowly — the minimum acceptable to every party that has to agree, from Liechtenstein to China. The problem is that what everyone can accept is not necessarily adequate to the task. The ban on CFCs that the Montreal Protocol achieved in the late 1980s was possible only because the requisite technology-switching affected a very small number of sectors, and the impact was barely noticeable by citizens, whether in terms of job losses or quality of life.
The greater and more complex the technology-switching challenge, conversely, the poorer the ability of the intergovernmental treaty model to respond. Efforts now under way to construct an intergovernmental nitrogen-pollution management regime, modeled on the twin bodies of the UNFCCC and the Intergovernmental Panel on Climate Change (IPCC), are likely to face the same stagnation as energy-transition efforts, given the complexity of agricultural pollution solutions compared with the CFC fix.
Perhaps even more worrisome is that treaty-based intergovernmentalism is also sub-democratic, and citizens have begun to take note.
The European Union, perhaps the most advanced and integrated of all intergovernmentalist structures, is currently beset by a crisis of legitimacy, a phenomenon political scientists have termed the EU’s “democratic deficit.” The same sense of a lack of democratic control that led to Brexit seems to be held widely across the bloc. Laws are handed down from Brussels, over which people feel they never had a say, and certainly didn’t vote for. Legislation is initiated not by elected representatives but by a civil service, the European Commission. The powerful Council of Ministers serves as a legislative chamber for which there is no general election, and through which legislative deliberation happens in secret, closed off to both the press and the citizenry. The sole directly democratic chamber, the EU Parliament, has no power to initiate legislation, only to respond to it.
The ongoing riotous explosion of rage of the gilets jaunes (yellow vests) movement across peri-urban and rural France began in response to a fuel-tax hike intended, in part, to help meet European climate targets. But the background and explanation for the scale of the fury is certainly the EU’s continent-wide imposition of austerity economics that, again, citizens feel they had no say over.
If we continue to lean on treaty-based intergovernmentalism to solve environmental problems, and if we continue to practice global governance in the absence of global government, then our progress will not only be slow and inadequate but will also stall in the face of the wrath of citizens who recognize its lack of democratic legitimacy. With a system of Westphalian nation-states rather than countries represented in a global parliament, we simply do not have a worldwide democracy capable of responding to climate change in a globally responsive and equitable way. What we need, instead, is the capacity for global majoritarian decision-making: a worldwide democratic state.
It is not only the regnant political order that needs replacing, however. In today’s global capitalist world, the market vastly constrains our choices, perhaps no less than genetics constrains cyanobacteria and vascular plants.
The Left quite rightly spends a great deal of time critiquing markets for their production of inequality, their antipathy toward democracy in the workplace, and their role in the perpetuation of war and oppression. But there is another historic critique of markets that progressives sometimes forget. What Marx called the “fettering” of production, we might term “irrational production”: the profit-based logic of markets that limits production to a smaller set of goods and services than would otherwise rationally be produced, and promotes the continued production of goods and services that should not rationally be produced.
The increasing threat of antibiotic resistance provides one exemplar. Every expert — from the former head of the US Centers for Disease Control and Prevention, Thomas Frieden, to the former Chief Medical Officer for England, Dame Sally Davies — agrees on the cause of the problem. For the past three or four decades, large pharmaceutical companies have largely gotten out of the business of researching and producing new classes of antibiotics because it is more profitable to develop drugs for chronic diseases that must be taken every day. No immorality is at work here, just the market’s irrational amorality.
Today, similarly, so long as fossil fuels remain profitable, firms that produce them will have an incentive to continue to do so. Likewise, no matter how environmentally beneficial some other good might be, if it is not profitable, or even if it is insufficiently profitable, it will not be produced.
Arresting this kind of “irrational production” as it shackles our efforts to address climate change will require some form of nonmarket intervention, whether by way of regulation or popular opposition. The great success stories of environmentalism — ozone layer restoration, global forest cover increase over the past three decades, and elimination of acid rain over the Great Lakes — were all achieved primarily by way of nonmarket mechanisms that enforced technology-switching.
In this context, it is crucial to understand that the market is at fault for our current predicament, not growth — as far too many on the green Left (and even many establishment figures such as Sir David Attenborough and Prince Charles) contend. Those who argue for limits to growth, or even degrowth, forget the historic battle that the socialist Left, dating back to Friedrich Engels, mounted against Malthusianism. Worse, they neglect the impossibly severe implications of their claims. Former World Bank economist and leading expert on global inequality Branko Milanović has performed a back-of-the-envelope calculation estimating that if we did as the limits-to-growth advocates demand, while eliminating economic inequality (by apportioning all the world’s annual income equally among the roughly 7.7 billion of us), each person would receive an annual income of $5,500. This would be such a radical constriction of Western workers’ standard of living that the austerity and wage restraint of a Margaret Thatcher or Ronald Reagan would appear benign by comparison.
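The arithmetic behind Milanović’s estimate is easy to check. A minimal sketch, assuming a total global household income of roughly $42 trillion (an illustrative round figure; the text does not give his exact input) shared among 7.7 billion people:

```python
# Back-of-the-envelope version of Milanović's equal-shares calculation.
# The $42 trillion total is an assumed, illustrative figure, not
# Milanović's own published input.
GLOBAL_HOUSEHOLD_INCOME_USD = 42e12   # assumed total annual income
WORLD_POPULATION = 7.7e9              # roughly 7.7 billion people

per_capita_income = GLOBAL_HOUSEHOLD_INCOME_USD / WORLD_POPULATION
print(f"Equal-shares annual income: ${per_capita_income:,.0f}")  # ≈ $5,455
```

Any plausible estimate of total global income yields the same order of magnitude: an equal share lands in the mid-thousands of dollars per year, far below a Western wage.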
Beyond the injustice that would result from a steady-state economy, let alone actual degrowth, the green Left and its allied figures, by targeting growth instead of the market, have lost sight of the real problem at hand. We did not save the ozone layer by limiting growth in the production of fridges and cans of hairspray. It was regulatory intervention in the market that did the trick. It was planning, in other words — global economic planning.
Of course, such a political–economic transformation at the scale required to avert climate change might seem farfetched in a world governed by the economic logic of the market and the political imperatives of the Westphalian nation-state. But would the call for a political economy that permits a democratically planned optimization of the Earth system require any less of a political and economic revolution than that demanded by neo-Malthusian advocates of limits to growth or even degrowth?
Nevertheless, the fact remains that in a globalized world in which climate change must be addressed, social democracy will not survive so long as it continues to operate within the context of the intergovernmentalism that has dominated climate mitigation efforts thus far. Our capacity to sustain both social democracy and the Goldilocks conditions in which that profoundly human accomplishment has thrived will require that we move beyond the nation-state and toward a system of global democratic governance. This system should possess both the legitimacy of an elected body and the capacity to channel human agency into a massive, rapid, and coordinated response to climate change.
There are precedents for the kind of multinational, multigenerational struggle that the emergence of global democratic governance will require. The fight for women’s equality has been going on for more than a century; for all the incredible progress made around the world, we still have very far to go. The movement for a democratically planned world system should be thought of in the same way: as a sustained effort that builds toward an overarching goal over decades or even centuries, rather than as an event akin to the French or Russian Revolution that bursts onto the scene on a single transformational date.
For over one hundred years, internationalism has been a watchword of the Left. But for too long, it has been put off for the future, or invoked in a vague gesture of humanism or universalism that has little to offer to solve the wicked problems of the 21st century. There can be little doubt that the ecological crises we face today demand that global democratic planning stand at the center of progressive thought in the Anthropocene.
The Earth before humanity was directionless, a ship without a captain. It didn’t care whether it extinguished itself. Today, the market left to its own devices, and a planet without a democratic government, likewise leaves our world without a hand on the tiller. But we humans do care, deeply, about our own flourishing, and it lies in our capacity to put our hand on that tiller. If we want to confront climate change and biodiversity loss globally — if we want to preserve our anomalous, anthropocentric, and increasingly anthropogenic set of Goldilocks conditions — we must come to terms with the fact that what is required of us is nothing short of global democratic governance: ruling the Earth system in the interest of maximizing human flourishing, of expanding our freedom without bound.