July 7, 2014
There’s no doubt that humans have drastically changed the Earth. The global-scale impacts of humans on the environment have led many scientists, scholars, and environmentalists to use the term Anthropocene to describe our present geological period. The term is currently in widespread — though still informal — use, and scientists are actively debating whether we can officially say that we live in the “Human Age” and, if we do, when exactly it started.
By some definitions, the Anthropocene began at the dawn of agriculture, when humans began altering natural ecosystems by tilling fields and planting crops. Others date the start of the Anthropocene to 2,000 years ago, as large cities blossomed and regional empires expanded their reach around the world. Most often it’s the Industrial Revolution that marks the start of the Anthropocene, due to the explosion in fossil fuel use and the subsequent increase in population, consumption, and pollution.
Regardless of the exact starting point, many agree that the time since World War II has marked a significant expansion of human effects on the environment. The end of the war itself left a mark on the geological record, with the deposition of radioactive material from the atomic blasts of 1945. The postwar boom in population and new technologies (and thus pollution and environmental impact) has been termed “The Great Acceleration”: by dozens of measures of environmental impact, a steep upward trajectory began around 1950 and has continued ever since.
During the Anthropocene we’ve altered the land, the air, the water, and the biosphere, drastically impacting the biodiversity of every ecosystem. While it’s the furry species of extinct and endangered creatures that get the most popular attention, there is growing concern about the effects of the Anthropocene on the less cute and often invisible diversity of microbes. A brand new (open access) paper in the journal Anthropocene looks at the different phases of the Anthropocene from a microbial point of view. The authors review the changes to the human microbiome after the dawn of agriculture, the shift to urban life during the Industrial Revolution, and the more industrialized diet that came with the Great Acceleration. They also look at other factors affecting microbes: the growth of antibiotic resistance and the spread of disease, the ways that industrial agriculture, chemical fertilizers, and rising temperatures alter soil microbial diversity, and how ocean acidification shifts the populations of algae and plankton.
These are big changes for small organisms, with big impacts on human and environmental health. We have unintentionally altered the genomes of individual bacteria and the structure of their collectives, unconsciously running global-scale experiments on microbial evolution and antibiotic resistance. The most striking aspect of the paper, however, wasn’t its catalog of these unintended effects of human technology on bacteria, but rather its suggestion that intentional DNA manipulation might someday become a major player in the course of evolutionary history.
DNA sequencing and synthesis technologies might be used to create new microorganisms with potentially far-reaching environmental effects, thus contributing to the acceleration of the Anthropocene. But these technologies might also be used to mitigate and deal with some of the other consequences of anthropogenic climate change: by producing materials more sustainably, creating new antibiotics, and performing other ecosystem services. For their potential importance in both contributing to and dealing with the Anthropocene, the authors propose that the start of the Great Acceleration shouldn’t be 1945 or 1950, as most others have proposed, but 1953 — the year that Watson and Crick published the double helix structure of DNA.
This is a surprising proposal because, though they are increasingly powerful, DNA technologies haven’t created environmental impact on the same global scale as something like the steam engine. (At least, the microbial genome-scale synthetic biology that the authors mention hasn’t; genetic technologies in agriculture have certainly made big changes in land use, industrial farming practices, and other relevant Anthropocene measures.) The hype surrounding novel biotechnological tools, however, frequently asserts that biology will enable technological solutions to the global problems that other technologies have caused. From algae that produce biofuels to bacteria that eat pollution to crops that can tolerate drought, biotechnologies are frequently defined by their potential to someday solve the ecological crises of the Anthropocene.
Perhaps rather than backdating the relevance of this sort of biotech, we might consider how this hype could lead us instead to a new phase of the Anthropocene. Now that we are coming to terms with the term “Anthropocene” and are proposing new technologies specifically to combat the problems caused by other technologies, the Great Acceleration might soon turn into a Great Technological Problem Feedback Loop. We might be setting ourselves up for a bio-techno-evolutionary arms race: we design new technology that has an unforeseen impact on living things, then design problem-solving biotechnologies that carry their own potential problems, and on and on.
In Through the Looking Glass the Red Queen tells Alice that “it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!” As we run to outpace technological problems of the past, are we creating the Red Queen’s Anthropocene of the future? This doesn’t mean that we should stop trying to solve the very real problems of the Anthropocene, but perhaps we need to think more about what it might mean to technologically “run twice as fast” to finally get somewhere else.