For the Fossil Record


One of the most maddening problems of our era is the glut of information and what to do with it. But while web scraping and data mining have only become big business in the last twenty years—indeed Claude Shannon first theorized the concept of “information theory” only about eighty years ago—biologists and gentlemen scientists have been laboring to organize Earth’s teeming masses of living things for centuries. In the Review’s February 26 issue, Ian Tattersall reviews Jason Roberts’s Pulitzer Prize–winning Every Living Thing: The Great and Deadly Race to Know All Life, a dual biography of Carl Linnaeus and Georges-Louis Leclerc, comte de Buffon, two eighteenth-century naturalists who dedicated themselves to the quest to classify and catalog the “riotous diversity” of all life. As Tattersall drily observes, “The world clearly contains vastly more species of living organisms than Noah could ever have fit into his ark.”

Tattersall is a paleoanthropologist and a curator emeritus at the American Museum of Natural History. He has written prolifically on human evolution, the history of cognition, and lemurs for both academic and general audiences, in outlets ranging from Science and the Journal of Anthropological Sciences to Scientific American, The Wall Street Journal, and The New York Times. For the Review he has written about the development of symbolic thought and humanity’s first linguistic ancestors. (These could be heated topics: “As early as 1866 the Linguistic Society of Paris specifically banned discussion of the origin of language as being altogether too disruptive for the contemplative atmosphere of a learned association.”) He is also the author of more than a dozen books, including The Strange Case of the Rickety Cossack: And Other Cautionary Tales from Human Evolution (2015) and, most recently, Understanding Human Evolution (2022).

I wrote to Tattersall last week to ask about other lost titans of science, lessons from lemurs, and the future of human history.


Anika Banister: Are there any seemingly lost figures of science whom you would like to champion, as Roberts has rehabilitated the fallow reputation of Buffon? (Or is there anyone you wish would be lost to time?)

Ian Tattersall: Most scientists labor and vanish in obscurity despite making valuable contributions to the scientific enterprise, so there are many figures to choose from here. But in the arena of evolutionary biology there are two in particular who, while reviled or ignored in their own day, laid the basic groundwork for the emergence of the Darwin–Wallace theory of evolution in the mid-nineteenth century. One is the French naturalist Jean-Baptiste Pierre Antoine de Monet, chevalier de Lamarck (1744–1829), who makes a cameo in Roberts’s book. Around the turn of the nineteenth century, Lamarck concluded that some of the lineages of fossil mollusks he identified in the rocks of the Paris Basin had not simply remained as the Creator had made them but instead showed steady change over time. Lamarck was the first Western scientist to seriously question the fixity of species, an essential first step toward the notion of Darwinian evolution. Yet he has by and large failed to receive proper recognition for this fundamental insight. Instead, posterity has pilloried him for his proposed agent of change: the “inheritance of acquired characteristics,” whereby parents are supposed to pass along to their offspring physical novelties they have acquired during their lifetimes. The most famous thought experiment in this area envisions successive generations of giraffes elongating their necks through prolonged striving to browse ever higher in the trees. This view was actually uncontroversial in Lamarck’s day, and it has in a sense been resuscitated in highly attenuated form by the rise of epigenetics, which studies how small-scale DNA structural modifications due to environmental influence can turn genes on and off. But it is still no compliment to be called a Lamarckian.

Despite his undeservedly poor reputation, Lamarck’s name remains familiar today. The second figure I’d like to mention, the Italian geologist Giovanni Battista Brocchi (1772–1826), richly deserves to be rescued from almost total obscurity. When Brocchi studied fossil shells from the Apennine Mountains (formerly the floor of a shallow sea) he, too, observed faunal change. But as early as 1814 he recognized that the individual species in his collection had remained more or less stable in form even as the overall fauna changed over time, and he speculated that those species not only had distinct origins and life spans and extinctions, but had been capable of giving rise to distinctive descendants.

It is not clear whether Charles Darwin knew of Brocchi’s work, though he might well have. But, taken together, Lamarck’s ideas of gradual change and Brocchi’s splitting of lineages provided the necessary ingredients for what would much later become the theory of evolution, which has at its center the notion that, however much they may have diversified with the passage of the eons, all living things are united by descent.

As for the rogues one wishes would go away, for a paleoanthropologist there is one outstanding choice. Charles Dawson (1864–1916) was a prolific forger of antiquities who (with or without assistance) was responsible for “Piltdown Man,” a supposedly ancient human fossil from southern England that actually combined parts of a modern human cranium with a broken ape jaw. Some paleoanthropologists immediately denounced this bizarre anomaly, and virtually everyone had written it off within two decades of its announcement in 1912. But Piltdown Man nonetheless hung around until it was definitively exposed as a fraud in 1953; even now it refuses to be forgotten. Most damagingly, this long scientifically dismissed, fraudulent chimera is still regularly pounced on by creationists as “proof” of paleoanthropological disingenuousness.

“Cognitive dissonance is baked into the human condition,” you write, apropos Linnaeus’s belief that epilepsy is caused by washing one’s hair. What do you think is the most pernicious absurdity in science now, and what is the best remediation we can hope for?

As a matter of sheer consequence, the clear winner in the pernicious absurdity stakes is climate-change denial. Several decades have passed since scientists sounded the first warning bells, and the past few years have come with record-breaking droughts, wildfires, extreme weather events, ice-sheet loss, melting permafrost, sea-level rises with coastal flooding, and so much more. We know well which human activities lie behind these predictable and disastrous phenomena that signal a trajectory toward an uninhabitable planet. Yet an alarming number of people appear to believe that if we ignore these signs they will go away or prove to be “normal” climatic fluctuations, or even that they are just illusions. Worryingly, those attitudes also seem to reflect a more general distrust of the science that underpins our modern way of life. As for the future, attitudes toward climate change (and science) seem to be becoming badges of political identity. Witness our government’s recent retraction of the “endangerment finding” that greenhouse gases and their consequences put public health and welfare at risk. This supremely irresponsible measure does not give one much hope for remediation in any effective time frame.

In The Strange Case of the Rickety Cossack, you wrote that “if you want to know how your own ancestors lived and functioned early in the Age of Mammals, it is to lemurs you have to turn.” What lessons from your early field work with lemurs do you most want to impart to the lemur-illiterate?

Lemurs are unique to the island of Madagascar, where I initially went to study fossils of recently extinct species. But once there I took the opportunity to observe the surviving lemurs in their natural habitats—and it was love at first sight. Those primates are incredibly charismatic, and I immediately wanted to know more about them at a time when little was known. I would happily still be a lemurologist today, but for reasons beyond my control this didn’t work out, and I found myself once again studying human evolution.

But I returned to paleoanthropology with an entirely altered perspective. The existence of only one human species in the world today has often tempted paleoanthropologists to think (wrongly) that this is the natural order of things, and that their job is therefore to project that one species back into the past in as straight a line as possible. There is, in contrast, a profusion of lemurs: there are five entire families of them, and by one rather extreme count well north of a hundred species. Studying them made me keenly conscious of just how much variety is out there in the living world; fortunately, this new awareness transferred beautifully to my research on what has turned out to be a notably diverse human fossil record. Far from being a story of continuous directed improvement, human evolution has involved a lot of trial and error as multiple hominin species stepped on and off the evolutionary stage over the past seven million years.

In your book The Monkey in the Mirror: Essays on the Science of What Makes Us Human, you write about the inertia of scientific thought: “What one is first taught on a subject tends to have much more influence than whatever one hears subsequently about the same topic.” Are there any ideas from early in your career that proved durable? Or any that you reluctantly had to change?

Teaching students with varied educational backgrounds showed me how difficult it often is to question ideas learned early on. Fortunately, my own quite privileged experience was a little different. As an undergraduate, and then again as a graduate student, I was drilled in the orthodoxies of the New Evolutionary Synthesis, the dominant perspective on evolutionary process during the mid-twentieth century. The Synthesis saw evolution as steady change over time, under the guiding hand of natural selection. Time and change were more or less synonymous, and evolution was slow, gradual, and continuous as, generation by generation, better-adapted individuals out-reproduced the less well endowed. In its reductionist simplicity this idea of natural selection made for a seductive story; a century earlier its “why didn’t I think of that?” appeal had hugely helped Darwin sell his core idea of evolution as “descent with modification.”

But my intellectual environment radically changed when I moved to the American Museum of Natural History, where a revolution in evolutionary biology was underway as my new colleagues Niles Eldredge and Stephen Jay Gould (the latter of whom had recently departed for Harvard) argued that what was observed in the fossil record didn’t fit the pattern of slow change predicted by the Synthesis. Rather, the fossils seemed to be indicating that species (as Brocchi had realized) tend to linger in the fossil record basically unchanged, until they are abruptly replaced by others. Non-change seemed to be the order of the day, interrupted by short bursts of innovation marked by the appearance of new species.

That was not what I had been primed to perceive, but I was rapidly convinced of the general application of the new model when Niles and I tackled the human fossil record and found that it, too, lacked any strong signal of slow, steady change. What’s more, I eventually came to understand that natural selection, while a mathematical certainty in any population in which more individuals are born than survive to reproduce, most commonly acts as an agent of stability rather than of change. It mainly keeps entire populations fit for their environments by trimming off their most unsuitable variants.

So yes, I have had the experience of doing an intellectual U-turn. But happily that turn was made neither reluctantly nor with particular difficulty, because I made it in a stimulating new intellectual environment and by interacting with some extraordinary people. I was really lucky.

You open your essay with the dictum, “We human beings are brilliant at ignoring uncomfortable truths.” Are there any uncomfortable truths emerging today in the field of paleoanthropology (or anthropology) that are facing or may face resistance?

We Homo sapiens are a pretty egotistical bunch, and accordingly we like to think of ourselves as the end product of a selective process that has exquisitely perfected us over the eons. That’s why we lap up pop-evolutionary explanations of why we cheat on our spouses or crave high-calorie fats. It’s also why many paleoanthropologists believe that such complex human features as our unique cognitive style must have deep roots in time. In the end, though, I have had to conclude that the language and symbolic thought that give us human beings such a sense of superiority over the rest of nature were both recently and adventitiously acquired, in an event that was far too sudden to have been driven by slow natural selection. We are uncomfortably accidental!

Linnaeus and Buffon were working with considerably fewer tools than we have today. How have new approaches changed the work of taxonomists? Is there a future in which morphology becomes obsolete in mapping evolutionary history?

No question, taxonomy today is vastly different from what it was in the eighteenth century. But I fervently hope—and expect—that morphology will always have a place in it. The main problems taxonomists face today are procedural ones, relating to how we can best integrate all the many different kinds of information that we can now collect in addition to morphology. One can see a future (and hopefully disciplined) use for AI in all of this. But we will nonetheless still be asking the same basic question that we have asked from the very beginning: How, exactly, is nature organized?
