On June 30, 1860, Samuel Wilberforce, DD, 36th Bishop of Oxford, attended the 30th annual meeting of the British Association for the Advancement of Science, held at Oxford University. Countless students have been taught that during the meeting, Wilberforce attacked evolution, setting off an impromptu debate which became a “tipping point” in the history of thought. I was one of those students. The debate, my biology professor explained, was the opening salvo of the war between Science and Religion — and Religion lost. My textbook backed him up. Wilberforce’s anti-evolution assault, it said, was swept aside by researchers’ “careful and scientific defense.” In a flourish unusual in an undergraduate text, it boasted that the pro-evolution arguments “neatly lifted the Bishop’s scalp.” That day the forces of empirical knowledge had beaten back the armies of religious ignorance.
None of this is accurate. There was no real debate that day in Oxford. Nor was there a clear victor, still less a scalping. Not that many people were paying attention; the “debate” was not mentioned by a single London newspaper. Still, the exchange was important, though the quarrel was less between scientific theory and religious faith than between two conceptions of humankind’s place in the cosmos. And far from being an enduring victory for rationality over faith, the debate inaugurated a conflict which continues to the present day, and is less about the past than about the future.
In 1860 science was not assumed to be inaccessible to ordinary people; the attendees at the Advancement of Science meeting included many ordinary, middle-class Britons, as well as Oxford students and faculty. The crowd packed a hall at the university’s new museum, standing in aisles and doorways. In that jammed, sweltering space, the subject on attendees’ minds was On the Origin of Species, by Charles Darwin. Published just seven months before, the book had created a public uproar, dividing educated Britons into pro- and anti-evolution camps.
Wilberforce’s friends believed him to be an obvious candidate to lead the charge against Darwin. Ambitious, witty, and politically connected, the 54-year-old cleric had a reputation for such smooth and convincing oratory that his detractors mocked him as “Soapy Sam.” His allies were sure that an eloquent public condemnation from him would deal a severe blow to Darwinism.
As the bishop may have known, another audience member was poised for a counterstrike: Thomas Henry Huxley, almost 20 years Wilberforce’s junior but already known as much for his vehement defense of Darwin, a friend, as for his contributions to comparative anatomy. A poor boy who had never been able to finish his university degree, Huxley had risen to a full professorship through ambition, brilliance, and dogged work. Prickly and quick to take offense, he enjoyed a good, vicious fight with plenty of character assassination.
In the version of events retailed in my college class, the bishop spoke for half an hour, his theatrical, booming voice filling the hall with Darwinism’s supposed flaws. Much of the audience was delighted; every barb drew cheers and approving laughter. Goaded, perhaps, by the crowd, the bishop closed his peroration with the kind of snide jab that he would never have used in the pulpit. Turning to Huxley with a saponaceous smile, he asked grandly whether it was “through his grandfather or his grandmother that he claimed his descent from a monkey.”
Huxley (my professor said) was delighted by this sally. He whispered to a neighbor, “The Lord hath delivered him into mine hands” — a line, fittingly enough, from Scripture. Then he stood. No, Huxley said, he would not be ashamed to have an ape for an ancestor, but he would be ashamed “to have sprung from one who prostituted the gifts of culture and eloquence to the service of prejudice and of falsehood.”
Like the rest of my class, I chuckled when I heard this exchange. But I couldn’t fathom why it counted as a victory for science. I understood that the bishop had overstepped by intimating that one of Huxley’s grandparents had been sexually involved with an animal. Huxley had used this gaffe as an opening to strike back with an even more direct personal swipe. But neither man had said anything substantive about evolution. How could this be an advance for rational thinking?
No transcript exists of Wilberforce’s remarks. But he had just written an 18,000-word takedown of Origin, soon to be published in a prominent literary journal. Most historians believe that at Oxford the bishop simply laid out the criticisms in his review. If so, Huxley faced a challenge, because Wilberforce’s review was far from ignorant. Indeed, Darwin later admitted that it was “uncommonly clever; [the review] picks out with skill all the most conjectural parts, & brings forward well all difficulties.” The bishop’s facility with scientific argument was unsurprising: he had a first-class mathematics degree from Oxford and, like Darwin, was a member of the Royal Society, Britain’s premier scientific body.
Part of the bishop’s cleverness lay in his decision to attack Darwin mainly on scientific grounds, rather than invoking Christian dogma. From the beginning, Wilberforce targeted Origin’s greatest weakness: the paucity of direct evidence for the evolution of one species from another. If the history of life were filled with these “transmutations,” the bishop reasoned, it must also be filled with in-between beasts, halfway evolved between old and new. Fossils of these in-between creatures should be everywhere. “Yet never have the longing observations of Mr. Darwin and the transmutationists found one such instance to establish their theory.” If evolution was real, where were these intermediate entities?
Only after spending more than 15,000 words critiquing the evidence for the existence of evolution did Wilberforce turn to his main concern: natural selection, the proposed mechanism for evolution. At bottom, the concept of natural selection is so simple that Huxley later claimed that his reaction to learning it was “How extremely stupid not to have thought of that!” Darwin contended that some offspring are, by chance, different from their parents; that some of these random differences — somewhat stronger muscles, say — will be beneficial (others will not be); and that individuals with the favorable variations will have a better chance of reproducing and passing on these favorable variations. In this way, Darwin argued, natural selection ensures that randomly appearing, advantageous features spread through populations. The process ensures that all species continually evolve through time, eventually giving rise, as the changes accumulate, to new species.
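The logic Darwin described — random heritable variation plus differential reproduction, with no foresight anywhere — can be illustrated with a toy simulation. This is only an illustration of the argument above, not any model Darwin or his critics used; every number and name in it is invented for the example.

```python
import random

def simulate_selection(generations=100, pop_size=200, seed=1):
    """Toy model of natural selection: each individual is a single
    heritable trait value. Offspring inherit a parent's trait plus
    small random variation, and parents are picked with probability
    proportional to the trait, so favorable variations are more
    likely to be passed on."""
    random.seed(seed)
    population = [1.0] * pop_size  # everyone starts out identical
    for _ in range(generations):
        # reproduction weighted by fitness -- this is the "selection"
        parents = random.choices(population, weights=population, k=pop_size)
        # offspring differ slightly, and randomly, from their parents
        population = [max(0.01, p + random.gauss(0, 0.05)) for p in parents]
    return sum(population) / pop_size
```

Run over a hundred generations, the average trait value climbs above its starting point of 1.0: chance variation plus a better chance of reproducing is enough to produce directional change through time.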
In the closing paragraphs of Origin, Darwin summed up his thoughts with an image: a hillside of untidy foliage that he often walked by. He asked readers to picture this “tangled bank,” as he called it, alive “with many plants of many kinds, with birds singing on the bushes, with various insects flitting about, and with worms crawling through the damp earth.” Although the inhabitants of the hillside — that is, the earth’s living creatures — are “so different from each other, and dependent upon each other in so complex a manner, [they] have all been produced by laws acting around us.” The important words here are “all” and “produced.” Living things may look dissimilar to the casual eye, but they are identical on a deeper level — all of them. Each and every one was produced by natural selection, and natural selection will determine its future.
Each and every one — that would include human beings, wouldn’t it? Here Darwin ducked. Throughout Origin, he sedulously avoided discussing whether his ideas also applied to people — if, that is, Homo sapiens were just another weed on his tangled bank.
Darwin’s reticence didn’t fool the bishop. If natural selection directs the course of life and people are part of life, Wilberforce wrote in his review, the clear implication is that “the principle of natural selection [applies] to MAN himself.” (Note the sudden burst of capital letters; the bishop was thundering in disapproval.) Human beings, too, must have evolved through natural selection from some previous, not-quite-human species. And this notion, Wilberforce said, is “absolutely incompatible” with a true understanding of the “moral and spiritual condition of man.”
Human beings, the bishop believed, had been created by God and endowed with a unique spark. But if, as Darwin suggested, people were created by unthinking natural forces, they could not possibly have any high standing. Should Homo sapiens share its nature with all other creatures — should our species be, as Wilberforce facetiously suggested, just a bunch of overachieving “mushrooms” — we would be, by definition, nothing special.
As the University College London geographer Simon Lewis has pointed out, Darwin’s argument — that humankind owed its existence to the same processes that gave rise to flatworms and amoebae — was a second, biological Copernican Revolution. The original Copernican Revolution is usually said to have begun in the early 16th century, when the Polish-German polymath Nicolaus Copernicus, drawing on data from Arab and Persian geometers, proposed that the earth orbits around the sun, and not the sun around the earth. Because the earth moved, it could not be, as had been thought, the focal point of the cosmos. The Copernican Revolution, the science historian Dick Teresi has noted, was neither a revolution, in that it unfolded over a long time, nor particularly Copernican, in that it was also the product of thinkers other than Copernicus. Still, it had great impact on our conception of ourselves and our place in the cosmos. Earth, our home, was no longer the pivot of existence. It was just a place, one among many, without particular distinction.
Unlike the first Copernican Revolution, the second happened rapidly and was largely the product of a single mind — Charles Darwin. But it, too, nudged our species out of the spotlight. “We are not even at the heart of life on Earth,” as Lewis put it. Because humankind owed its existence to the same processes that produced every other organism, Darwin implied, Homo sapiens was a species like any other, without particular distinction. This new Copernican Revolution was what had attracted Wilberforce’s ire.
To the bishop, there was a fundamental line between human beings and all other creatures. Naturally, he described that difference in Christian terms: people had souls, animals did not; people were endowed by God with the capacity for change and redemption, animals were not. But Wilberforce’s view can be put in broader, more general terms, which do not depend on religious belief: Homo sapiens has an inner flame of creativity and intelligence that allows it to burn down barriers that would trap any other species. Or, more succinctly: human beings are not wholly controlled by the natural processes that control all other creatures. We are not simply another species.
Wilberforce’s remark about Huxley’s ape ancestors was thus more than a snarky gibe. Consciously or not, the bishop was asking whether Huxley was prepared to affirm that he and all other people were prisoners of biology. Blinded by contempt, Huxley seems not to have realized that his adversary was posing, however rudely, an important question. (The “great question,” the conservationist George Perkins Marsh called it a few years later: “whether man is of nature or above her.”) Not grasping the underpinnings of the dispute, Huxley didn’t even try to engage them. Darwin later shuddered at the “awful battles which have raged about ‘species’ at Oxford,” but there was no actual debate. At least not in the sense of a genuine attempt to hash out diverging beliefs.
Both Huxley and Wilberforce thought they had come off well. Three days after the encounter, the bishop bragged to a friend, “I think I thoroughly beat him.” Certainly his supporters in the audience, “cheering lustily,” thought so. Equally pleased, Huxley later boasted that he “was the most popular man in Oxford for full four and twenty hours afterwards.” In the years to come, Huxley and Wilberforce ran into each other from time to time. The meetings were always cordial. Both viewing themselves as the winner, they could be magnanimous in victory.
Over the decades, Huxley came to be seen as triumphant. In his school in the 1960s, the writer Christopher Hitchens was taught that “Huxley cleaned Wilberforce’s clock, ate his lunch, used him as a mop for the floor, and all that.” In my college a few years later, I learned much the same thing. The Wilberforce-Huxley dustup was presented as a morality play ending in a straightforward victory for rational thought. Only much later did I realize that from today’s perspective the implications of our species’ lack of specialness were different from what my teacher and textbook had presented.
In Wilberforce’s day, those who hoped for a better future cheered on Huxley because science and technology seemed to promise a better life. But now that science and technology have allowed the human enterprise to risk its own survival, the partisans of hope have stepped back from some of Huxley’s implications. If the Oxford debate was a morality play, the vices and virtues have slipped off stage and switched masks.
Lynn Margulis was a Huxleyite. A researcher who specialized in cells and microorganisms, Margulis was one of the most important biologists in the last half-century. Until her death in 2011, she lived in my town, and I would bump into her on the street from time to time. Homo sapiens, she once told me, is an unusually successful species. And it is the fate of every successful species to wipe itself out — that is the way things work in biology.
Margulis explained these ideas to me while talking about one of her scientific heroes, the Russian microbiologist Georgii Gause. In 1934, Gause published The Struggle for Existence, which today is viewed as a scientific landmark, one of the first successful marriages of experiment and theory in ecology.
By today’s standards, his methodology was simplicity itself. Gause decanted an oatmeal broth into small, flat-bottomed test tubes. Into each he dripped five Paramecium caudatum or Stylonychia mytilus, both single-celled protozoans, one species per tube. He stored the tubes for a week, during which time he observed the results.
What Gause saw in his test tubes is often depicted in a graph, time on the horizontal axis, the number of protozoa on the vertical. The line on the graph is like a distorted bell curve, with its left side twisted and stretched. By squinting a bit, it is possible to imagine that the left side of the curve is a kind of flattened S, which is why scientists often refer to Gause’s curve as an “S-shaped curve.” At the beginning (that is, the left side of the S-shaped curve), the number of protozoans grows slowly, and the graph line slowly ascends to the right. But then the line hits an inflection point, and suddenly rockets upward — a frenzy of growth. The mad rise continues until the protozoans begin to run out of food, at which point there is a second inflection point, and the growth curve levels off as the organisms begin to die. Eventually the line descends, and the population falls toward zero.
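The shape of that curve can be reproduced with a crude difference-equation sketch. The numbers below are my own toy parameters, not Gause’s data or equations: growth is proportional to both the population and the remaining food, a constant fraction dies each step, and the dish’s food is steadily eaten away.

```python
def gause_curve(steps=60, food=1000.0):
    """Toy petri-dish model: slow start, explosive rise once the
    population is large, then collapse after the food runs out."""
    population, history = 1.0, []
    for _ in range(steps):
        births = 0.5 * population * (food / 1000.0)  # breeding needs food
        deaths = 0.1 * population                    # constant death rate
        food = max(0.0, food - 0.2 * population)     # feeding depletes the dish
        population = max(0.0, population + births - deaths)
        history.append(population)
    return history
```

Plotting `history` gives the flattened S with a crash at the end: both inflection points fall out of nothing more than the coupling between the eaters and a finite food supply.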
Years ago I watched Margulis demonstrate Gause’s conclusions to one of her classes with a time-lapse video of Proteus vulgaris, a bacterium that resides in the intestinal tract. Left alone, it divides about every 15 minutes, producing two individuals where before had been one. Margulis switched on the projector. Onscreen was a tiny dot — P. vulgaris — in a shallow, circular glass container: a petri dish, its bottom covered with a layer of reddish nutrient goo. The students gasped. In the time-lapse video, the colony seemed to pulse, doubling in size every few seconds, rippling outward until the mass of bacteria filled the screen.
By luck or superior adaptation, a few species manage to escape their limits, at least for a while. Nature’s success stories, they are like Gause’s protozoans; the world is their petri dish. Their populations grow at a terrific rate; they take over large areas, engulfing their environment as if no force opposed them. Then they hit a barrier. They drown in their own wastes. They starve from lack of food. Something figures out how to eat them.
When Margulis told me that human beings, like Gause’s protozoa, would wipe themselves out, she was affirming her belief in Darwin’s view: biological laws apply to every creature. After talking with her, I sometimes told people about these ideas. Few accepted them, and even those who agreed did not fully endorse Margulis’s perspective. They told me that the human race was doomed because people are greedy and stupid, not because, as Margulis thought, overreaches and crashes are the natural way, as much a part of the wonders of life as coral reefs and tropical forests. But I also never met anyone who had a convincing argument that she was wrong.
Over the years, as my journalistic assignments accumulated, it seemed to me that the responses to my questions fell into two broad categories. Wizards and Prophets, as I have come to call them, each have a separate blueprint for the future. Prophets look at the world as finite, and people as constrained by their environment. Wizards see possibilities as inexhaustible, and humans as wily managers of the world. One views growth and development as the lot and blessing of our species; the other regards stability and preservation as our future and our goal. Wizards regard the earth as a toolbox, its contents freely available for us; Prophets think of the natural world as embodying an overarching order that should not lightly be disturbed. But both assume that Wilberforce, not Huxley, was correct — that human beings are special creatures who can escape the fate of other successful species.
A year after Margulis’s death in 2011, I bumped into Daniel B. Botkin, an ecologist who had recently retired from the University of California, Santa Barbara. Botkin has worked in many areas but is perhaps best known for Discordant Harmonies (1990), a classic study debunking the long-held belief that ecosystems will exist in a timeless balance unless people disturb them. He had known Margulis well and respected her. “But she’s wrong on this one,” he said.
Not all species would multiply themselves out of existence if given the chance, he said. Among the exceptions is the whooping crane, Grus americana. The subject of one of the longest conservation efforts in North America, the whooper, as it is called, is a sister species to the Eurasian crane, Grus grus; geneticists believe the two species split off from a common ancestor one to three million years ago. Despite their physical similarity, the birds behave differently. Hundreds of thousands of Eurasian cranes exist, despite human hunting. The bird aggressively expands its territory when possible, sometimes infuriating farmers by taking over their fields. Whoopers, by contrast, are shy creatures of the marsh, rarely seen in groups bigger than two; as far as is known, the entire species has never numbered more than 1,500 individuals. “Explosive growth is evidently not part of its evolutionary strategy,” Botkin said.
There are other examples — not many, but they exist. Another example, from Botkin: the Tiburon mariposa lily (Calochortus tiburonensis). Native to northern California, it lives only on soils made from serpentine, a relatively rare kind of stone that produces soils filled with chromium and nickel, which are toxic to most plants. Serpentine soils occur usually in isolated patches with relatively defined borders — natural petri dishes, one might say. The lily reproduces slowly enough that it never overwhelms its environment. It never hits the edge of the petri dish.
Was there any known case, I asked Botkin, of a species changing its evolutionary strategy? A creature that went from rapid, Gause-style expansion to quiet adjustment to its environment? Of a protozoan transforming itself, so to speak, into a whooping crane? Or of a plant that somehow makes its own serpentine soil?
“That’s the question, isn’t it?” Botkin said.
One possible answer to the question is provided by Robinson Crusoe, hero of Daniel Defoe’s famous novel. Shipwrecked alone on an uninhabited island off Venezuela in 1659, Crusoe is an impressive example of fictional human resilience and drive. During his 28-year exile, he learns to catch fish, hunt rabbits, tame goats, prune citrus trees, and create “plantations” of barley and rice from seeds salvaged from the wreck. (Defoe didn’t know that citrus and goats were not native to the Caribbean and thus probably wouldn’t have been on the island.) Rescue comes in the form of a shipful of mutineers, who plan to maroon their captain on the supposedly empty island. Crusoe helps the captain recapture his ship and offers the defeated mutineers a choice: permanent exile on the island or trial in England. All choose the island. Crusoe has harnessed so much of its productive power to human use that even a gaggle of inept seamen can survive there in comfort.
Robinson Crusoe’s first three chapters recount how its hero ended up on his ill-fated voyage. The youngest son of an English merchant, Crusoe has a restless spirit that leads him to become an independent slave trader. On a voyage to Africa, his ship is captured by a “Turkish rover” captained by a Moor from Morocco. “As his proper Prize,” Crusoe becomes the captain’s house slave. After two years of servitude, Crusoe steals his master’s fishing boat and escapes. He bumbles in the boat down the West African coast without food or water and is rescued by a Portuguese slave ship bound for Brazil. There the enterprising Crusoe establishes a small tobacco plantation. But he is short of labor, and decides with some other plantation owners to obtain that labor by taking a ship to Africa and buying some slaves. The ship wrecks on the return voyage. Except for Crusoe, all hands perish, slaves included. He ends up alone on his island.
What is striking to a modern reader is that Defoe saw nothing remarkable about expecting readers to sympathize with a man in the slave trade. Crusoe has no qualms about slaving even after having been, most unhappily, a slave himself. Here, character echoes author: Defoe extolled slavery as “a most Profitable, Useful, and absolutely necessary Branch of our Commerce.” Backing words with deeds, he owned shares in the Royal African Company, created in 1660 to buy men and women in Africa and transport them in chains to the Americas. When the company was attacked in Parliament, he offered to write the equivalent of editorials in its favor. It paid him the rough equivalent of $50,000 for his public-relations services.
Defoe was a person of his time. Three centuries ago, when he was writing Robinson Crusoe, societies from one end of the world to another depended on slave labor, as had been the case since at least the Code of Hammurabi in ancient Babylon. Customs differed from one place to another, but slavery was sanctioned and practiced everywhere from Mauritania to Manchuria. Unfree workers existed by the million in the Ottoman Empire, Mughal India, and Ming China. In classical Athens, two-thirds of the inhabitants were slaves; imperial Rome, the historian James C. Scott has written, “turned much of the Mediterranean basin into a massive slave emporium.” Slaves were less common in early modern Europe, but Portugal, Spain, France, England, and the Netherlands happily exploited huge numbers of them in their American colonies. In the last half of the 18th century alone, almost four million people were taken from Africa in chains. In colonies throughout the Americas at that time, in places ranging from Brazil to Barbados, from South Carolina to Suriname, slaves were so fundamental to the economy that they outnumbered masters, sometimes by ten to one.
Then, in the space of a few decades in the 19th century, slavery almost stopped entirely. The implausibility of this change is stunning. In 1860, slaves were the single most valuable economic asset in the United States, collectively worth more than $3 billion, an eye-popping sum at a time when the US gross national product was less than $5 billion. (The slaves would be worth as much as $10 trillion in today’s money.) Rather than investing in factories, as northern entrepreneurs did, southern businessmen had sunk their capital into slaves. Rightly so, financially speaking — slaves had a higher return on investment than any other commodity available to them. Enchained men and women had made the region politically powerful, and gave social status to an entire class of poor whites. Slavery was the foundation of the social order. It was, thundered South Carolina senator John C. Calhoun, “instead of an evil, a good — a positive good.” (Calhoun was no fringe character; a former US secretary of war and vice president, he would become secretary of state.) Yet despite the institution’s great economic value, part of the United States set out to destroy it, wrecking much of the national economy and killing half a million citizens along the way.
Incredibly, the turn against slavery was as universal as slavery itself. Great Britain, leader of the global slave trade, banned its market in human beings in 1807 after a tireless campaign by abolitionists. Two laws enacted in 1833 and 1838 freed all British slaves. Denmark, Sweden, the Netherlands, France, Spain, and Portugal soon outlawed their slave trades, too, and after that slavery itself. Like stars winking out at the approach of dawn, cultures across the globe removed themselves from the previously universal exchange of human cargo. Slavery still exists; the International Labor Organization estimates that almost 21 million people are still forced to work as captives. But in no society anywhere is slavery a legally protected institution — part of the social fabric — as it was throughout the world two centuries ago.
Historians provide many reasons for this extraordinary transition, high among them the fierce opposition of slaves themselves. But another important cause is that abolitionists convinced people around the world that slavery was a moral disaster. An institution fundamental to human society for millennia was made over by ideas and a call to action, loudly repeated.
In the last few centuries, such profound changes have occurred repeatedly. Another, possibly even bigger, example: Since the beginning of our species, almost every known society has been based on the subjugation of women by men. Tales of past matriarchal societies abound, but there is little archaeological evidence for their veracity. In the long run, women’s lack of liberty has been as central to the human enterprise as gravitation to the celestial order. The degree of suppression varied from time to time and place to place, but women never had an equal voice. Union and Confederacy clashed over slavery, but they were in accord on the status of women: in neither could women attend college, have a bank account, or, in many places, own non-personal property. Equally confining in different ways were female lives in Europe, Asia, and Africa. Nowadays women are the majority of US college students, the majority of the US workforce, and the majority of US voters. Again, historians assign multiple causes to this shift, rapid in time, confounding in scope. But a central element was the power of ideas — the voices and actions of suffragists, who through decades of ridicule and harassment pressed their case. In recent years something similar may have occurred with gay rights: first a few lonely advocates, censured and mocked; then victories in the social and legal sphere; finally, perhaps, a slow movement toward equality.
Equally profound is the decline in violence. Ten thousand years ago, at the dawn of agriculture, societies mustered labor for the fields and controlled harvest surpluses by organizing themselves into states and empires. These promptly revealed an astonishing appetite for war. Their penchant for violence was unaffected by increasing prosperity or higher technological, cultural, and social accomplishments. When classical Athens was at its zenith in the fifth and fourth centuries BC, it was ever at war: against Sparta (First and Second Peloponnesian Wars, Corinthian War); against Persia (Greco-Persian Wars, Wars of the Delian League); against Aegina (Aeginetan War); against Macedon (Olynthian War); against Samos (Samian War); against Chios, Rhodes, and Cos (Social War). Greece was nothing special — look at the ghastly histories of China, sub-Saharan Africa, or Mesoamerica. Look at medieval and early modern Europe, where war followed upon war so fast that historians bundle them into catchall titles like the Hundred Years’ War or the even more destructive Thirty Years’ War. The brutality of these conflicts is difficult to grasp; to cite an example from the Israeli political scientist Azar Gat, Germany lost between a fifth and a third of its population in the Thirty Years’ War — “higher than the German casualties in the First and Second World Wars combined.” The statistic is sobering: Germany lost a greater percentage of its people to violence in the 17th century than in the 20th, despite the intervening advances in the technology of slaughter, despite being governed for more than a decade by maniacs who systematically murdered millions of their fellow citizens.
As many as one out of every ten people met violent deaths in the first millennium AD, the archaeologist Ian Morris has estimated. Ever since, violence has declined — gradually, then suddenly. In the decades after the Second World War, rates of violent death plunged to the lowest levels ever seen. Today, humans are far less likely to be slain by other members of their species than a hundred years ago, or a thousand — an extraordinary transformation that has occurred, almost unheralded, in the lifetime of many of the people reading this article. Given the mayhem documented in every day’s headlines, the horrors in the Middle East and the strife in northeast Africa, the idea that violence is diminishing may seem absurd. Nonetheless, every independent effort to collect global statistics on violence suggests that we seem to be winning, at least for now, what the political scientist Joshua Goldstein calls “the war on war.”
Past successes do not guarantee future progress. Violence has ticked upward in the last decade, and may get worse. One can readily imagine some ghastly political or religious insurgency that reinstates slavery; many insurrectionary forces go out of their way to brutalize women. Global poverty has fallen dramatically in recent decades, but could rebound. Lunatics with nuclear weapons may yet strike — a possibility that will never go away. There is no permanent victory condition for being human, as the writer Bruce Sterling has remarked.
Given this record, though, even Lynn Margulis might pause. No European in 1800 could have imagined that in 2000 Europe would have no legal slavery, women would be able to vote, and same-sex couples would be able to marry. No one could have guessed that a continent that had been tearing itself apart for centuries would be largely free of armed conflict, even amid terrible economic times. No one could have guessed that Europe would have vanquished famine.
Preventing Homo sapiens from destroying itself à la Gause would require a still greater transformation, because we would be pushing against nature itself. Success would be unprecedented, biologically speaking. It would be a reverse Copernican Revolution, showing that humankind is exempt from natural processes that govern all other species. But might we be able to do exactly that? Might Margulis have got this one wrong? Might we indeed be special?
Consider, again, Robinson Crusoe. He was a slaver — but also, in the end, he had a special spark. Confronted with a threat to his survival, he changed his way of life, root and branch, to meet it. Working alone, he transformed the island, enriching its landscape. And then, to his surprise, he realized that he “might be more happy in this Solitary Condition, than I should have been in a Liberty of Society, and in all the Pleasures of the World.”
Living alone on a large, biologically rich island, Crusoe was able to take as many of its resources as he wanted — he was, so to speak, barely past the first inflection point on Gause’s curve. Margulis’s presumption is that if he and the mutineers had stayed, they eventually would have hit the second inflection point and wiped themselves out. (I am making the unrealistic assumption that they would not have left the island.)
Wizards and Prophets both believe that Margulis is wrong — that Crusoe and the others would have gained enough knowledge to save themselves. They would have either used this knowledge to create technology to soar beyond natural constraints (as Wizards hope) or changed their survival strategy from expanding their presence to living in a steady-state accommodation with what the island offered (as Prophets wish).
Of course, Crusoe was a fictional character (though Alexander Selkirk, the castaway whose story apparently inspired Defoe, was not). And the challenge facing the next generation is vastly larger than Crusoe’s challenge. But is it so unlikely that our species, a congeries of changelings, would be able to do exactly as Crusoe did — transform our lives to meet new challenges — before we round that fateful curve of the second inflection point and nature does it for us?
I can imagine Margulis’s response: You’re imagining our species as some sort of big-brained, hyper-rational, cost-benefit-calculating computer! A better analogy is the bacteria at our feet! Still, Margulis would be the first to agree that removing the shackles from women and slaves has begun to unleash the suppressed talents of two-thirds of the human race. Drastically reducing violence has prevented the waste of countless lives and a staggering amount of resources. Is it really impossible to believe that we would use those talents and those resources to draw back before the abyss?
Our record of success is not that long. In any case, past successes are no guarantee of the future. But it is terrible to suppose that we could get so many other things right and get this one wrong. To have the imagination to see our potential end, but not have the imagination to avoid it. To send humankind to the moon but fail to pay attention to the earth. To have the potential but to be unable to use it — to be, in the end, no different from the protozoa in the petri dish. It would be evidence that Lynn Margulis’s most dismissive beliefs had been right after all. For all our speed and voraciousness, our changeable sparkle and ash, we would be, at the last, not an especially interesting species.
Read more from Breakthrough Journal, No. 8
 Darwin argued that the fossil record was then too incomplete to show transitions, and that later discoveries would fill in the gaps. Almost all scientists believe he was correct. Since then, paleontologists have uncovered many “missing link” species. One example: Kulindadromeus zabaikalicus, discovered in 2014. A small, bipedal dinosaur, it had both bird-like feathers and reptilian scales — evidence that feathers were widespread among dinosaurs, not confined to the lineage that gave rise to birds.