Where’s my mastodon?
That’s the question I’ve been asking ever since I learned that the beasts existed and perished not so very long ago. Exactly why they disappeared depends on who you ask. Some experts point to dramatic climate shifts at the end of the Pleistocene that pared back these beasts’ favored habitats. Dissenting opinion blames human depredation, invoking waves of voracious people who ate the world’s megafauna out of existence as Homo sapiens moved out of Africa and beyond. And while there is sometimes compromise between these views - climate change destabilizing ecosystems, for example, which may have made human activity more dramatic in its effects - the fact that we’re ramping up to a sixth mass extinction crisis has often been folded into an ecological morality tale in which humanity has been a blight on the world’s biodiversity from the end of the Ice Age until today.
This isn’t an isolated or obscure academic debate. Our opinions about what killed Ice Age megafauna have played a key part in discussions over Pleistocene rewilding - bringing Asian elephants to North America to stand in for mammoths, for example - and in ballyhooed reports of cloning and other forms of de-extinction. If humans were responsible for the disappearance of these animals and the ecological connections those species fostered, the argument goes, then we have a responsibility to bring them back. And perhaps that is so. But it’s also worth investigating how even the idea of overkill - whether it fits the pattern or not - has influenced scientific fields that in turn inform policy and ethical obligations to global ecology. That’s exactly what archaeologists Lisa Nagaoka, Torben Rick, and Steve Wolverton consider in an analysis of “The overkill model and its impact on environmental research.”
The question of what happened to our Ice Age megafauna does not fall under the purview of a single discipline. It’s a mystery at the intersection of various sciences, as disparate as archaeology, anthropology, ecology, zoology, paleontology, climatology, botany, and more. And given that facts do not stand alone but are interpreted through theory, it’s little wonder that practitioners of varying sciences will have different views. So, to track how various sciences have responded to the idea of Pleistocene overkill, Nagaoka and colleagues followed the citation record of Paul Martin - up until his death in 2010, the principal promoter of the idea that humans drove Pleistocene megafauna to extinction.
Nagaoka and coauthors primarily focused on two fields of study that, despite their connection, often have little communication and collaboration with each other - archaeology and ecology. What the researchers found was that the two disciplines have very different views of what happened as the Ice Age closed, which in turn colors the way the extinction of the mammoth and mastodon is used as a rhetorical tool in modern arguments about extinction. This is important because, despite its seemingly widespread acceptance, the evidence for hungry hungry humans depleting large Pleistocene animals is not only contentious, but often lacking. “The reality is,” Nagaoka and coauthors write, “that the argument uses a series of untested assertions about human-environment interactions” and direct evidence of definitive hunting by Pleistocene humans is very rare despite a rich Ice Age fossil record.
So what does the comparison of the sciences show? Within archaeology, the role humans played in the end Pleistocene extinctions is an open question. Drawing from a survey of 91 archaeologists, as well as the citation search, Nagaoka and colleagues found that the majority of archaeologists sampled did not think that humans were the only, or even the primary, cause for the extinctions. Climate change was mentioned most often, with humans providing additional or secondary pressure in the form of hunting or altering the landscape. Among most archaeologists, who focus on the habits of people through time, the blame for the loss of Megatherium and Smilodon does not rest on humans alone. And even though there are problems with the climate change hypothesis and others, research written and cited by archaeologists is much more likely to recognize that there is a debate to be had and that investigations are ongoing.
The picture is very different in ecology, and has gotten far more media play through books like The Sixth Extinction and highly publicized events concerning de-extinction. The citation record is some help here. While archaeologists are more likely to cite Paul Martin’s earlier works on overkill - which focused primarily on North America and human movements through the continent - ecologists are more likely to cite his later works, in which the model is global. On top of that, the researchers found, ecological papers were more likely to use Martin’s hypothetical scenario as evidence that humans wiped out megafauna rather than as a reference to the idea.
What’s wrong with this paper trail is that many of Martin’s untested assumptions - that megafauna were naive to invading humans, and that human dispersal across the world explains the distribution of modern megafauna - are often stated as facts. This isn’t helped by what Nagaoka and colleagues call an interdisciplinary communication breakdown. Some of this is as simple as where experts publish. Critics of the overkill hypothesis, or those who see humans as one of several pressures leading to Pleistocene extinction, often publish in archaeological journals or those concerned with the latter part of the Cenozoic. Papers supporting the overkill hypothesis, however, often appear in broader scientific journals and have gotten plenty of additional publicity as citations in debates over Pleistocene rewilding and de-extinction, making them more likely to be taken as a sign of consensus by ecologists when no such consensus exists.
One would hope that the process of science would help correct this. Archaeologists or paleobiologists could publish their investigations and critiques in ecological journals, Nagaoka and colleagues suggest, but the peer review process is uneven: ecologists are more likely to listen to other ecologists - who are already predisposed to the overkill idea - than to experts from outside fields. This is strange, Nagaoka and coauthors write, given that archaeology is the science concerned with people through time and what they were doing. Wouldn’t that insight and information be useful in determining whether or not humans were responsible for driving sabercats and ground sloths into extinction? Consider, for example, that despite a very rich Pleistocene fossil record, only a handful of associations between humans and megafauna can be taken as evidence of hunting. Some of the strongest proponents of overkill aren’t actually reading or citing the literature bearing directly on the subject.
This situation is hard to change, particularly because we see the awful influence of human activity on biodiversity today. That humans started this pattern in the Ice Age thus becomes a political position, and questioning it is sometimes treated as if the critic were denying the modern extinction crisis. Still, the fact of the matter is that overkill is an untested, unverified hypothesis that has nevertheless gained sway, with guilt over humanity’s appetite for destruction driving the case for ecological penance. Whether or not humans actually sparked a global extinction crisis in the Pleistocene has become almost irrelevant in conservation communication because of the argument’s rhetorical value. “When overkill is used as a cautionary tale and a means to rally support for environmentalism, it portrays humans as a destructive species,” Nagaoka and colleagues write - destructive, apparently, not through what we choose to do but because it’s inherent to our nature. It’s a dark, deterministic view of our species. More than that, this view ignores cultural diversity across time and space, treating humans as homogeneously voracious and destructive, an insult justified through flimsy correlation.
Even if overkill were eventually found to be a real and significant global phenomenon during the Pleistocene, Nagaoka and colleagues write, there is more to the story than a cautionary tale or ecological guilt trip. One of the options, they write, is that overkill offers information about different ways human cultures have interacted with the environment - in what times and places were people more destructive as opposed to more concerned with sustainability? - and that such understanding helps us better appreciate how we are intertwined with nature instead of separating ourselves as a destructive force outside of it. It’s not simply that Pleistocene overkill is not supported by evidence. It’s that the concept divides us from nature and makes us villains, perhaps irredeemable ones. We can do better.