Just after World War II, when the atomic bombs fell and our thirst for coal and oil became a full-blown addiction, Earth entered the Anthropocene, a new span of geologic time in which humanity’s environmental reach has left a mark in sediments worldwide. That’s the majority conclusion of the Anthropocene Working Group, a collection of researchers that has spent the past 7 years quietly studying whether the term, already popular, should be submitted as a formal unit of geologic time.
After tallying votes this month, the group has decided to propose the postwar boom of the late 1940s and early 1950s as the Anthropocene’s start date. The group will ask the International Commission on Stratigraphy (ICS), the bureaucracy that governs geologic time, to recognize the Anthropocene as a series, the stratigraphic equivalent of an epoch, on par with the Holocene and Pleistocene that preceded it. Colin Waters, the group’s secretary and a geologist at the British Geological Survey in Keyworth, will reveal the group’s recommendations on 29 August at the International Geological Congress in Cape Town, South Africa.
The group won’t submit a formal proposal yet. To do so, it must gather multiple cores of sediment from around the planet and show that they contain a sharp transition in geochemical tracers that is likely to persist as a permanent part of the rock record; the core with the best example of the transition would then serve as a “golden spike,” marking the Anthropocene’s start. These cores could come from lakebeds, ocean floors, ice sheets—or even corals or tree rings. But they must capture the “Great Acceleration”: the postwar period when fossil fuel combustion took off, says Jan Zalasiewicz, a geologist at the University of Leicester in the United Kingdom who convened the group. “We’ll go and get our hands dirty, beginning to look for sections that we can formally propose.”
Those sections will have to be rich with multiple signatures, as the Anthropocene proposal faces deep skepticism from stratigraphers. “The voting members of the International Commission on Stratigraphy look at these things critically,” says Stan Finney, chair of ICS and a geologist at California State University, Long Beach.
He and other stratigraphers doubt that their standards can be properly applied to decades-old mud and silt rather than the solid rock that records older stratigraphic boundaries. They question the value of the Anthropocene for their science, which seeks to draw coherent chronologies out of sedimentary rocks. Some also resent the role that scientists from other disciplines such as climate science have played in driving the proposal and see it as a political statement.
Should ICS decide against the Anthropocene, some stratigraphers fear, they could be swamped with bad press. “I feel like a lighthouse with a huge tsunami wave coming at it,” Finney says. Phil Gibbard, a stratigrapher at the University of Cambridge in the United Kingdom and a working group member who voted against the proposal, also worries about a backlash. “We’re nervous,” he says.
The working group, a mix of 35 geologists, climate scientists, archaeologists, and others, considered multiple dates. There were votes for an early start to the Anthropocene, 7000 years ago, when humanity began converting forests en masse to pastures and cropland, perhaps causing carbon dioxide (CO2) to spike, and also for 3000 years ago, when lead smelting tainted the ground. More recently, they considered 1610, when pollen from the New World appeared in Europe, and the early 1800s, the start of the Industrial Revolution. But the most votes went to the Great Acceleration.
The group’s decision to go for a single, recent start date for the Anthropocene disappoints Bill Ruddiman, an emeritus professor of environmental science at the University of Virginia in Charlottesville. “It is a mistake to formalize the term by rigidly affixing it to a single time,” he says, “especially one that misses most of the history of the major transformation of Earth’s surface.” Many archaeologists also favor the 7000-year-old date, when early humans began to alter the planet’s surface. But the working group was looking for a signature of global, human-driven change that would wind up in the rock record, not the first traces of human influence on the local landscape.
In a study published in Science earlier this year, the working group highlighted their most likely proxies. Materials that rose to mass use in the 1950s, such as plastics and elemental aluminum, are prime targets. Plutonium from atmospheric nuclear testing, first visible in the soil in 1951, will linger in sediments globally for the next 100,000 years as it decays into uranium and then lead. But perhaps the most promising proxy comes from recent work that has shown, across 71 lakebeds worldwide, a 1950s spike in fly ash residue from the high-temperature combustion of coal and oil. “This is a permanent signal,” Waters says. “These particles will not be degrading.” He adds that the ash is directly tied to the human-driven increase in CO2 that sparked the notion of the Anthropocene in the first place.
Until this year, the group had not sought a golden spike; instead, it favored defining the Anthropocene simply by a starting date, a method stratigraphers have used only for units of time within the Precambrian, more than 540 million years ago, when clear dividing signals have been impossible to find in the rock. But several ICS members, including Finney, made it clear that a golden spike would be necessary for any chance of approval.
The group’s proposal may not satisfy him. But Waters hopes the ICS will consider the chosen golden spike on its merits. “Just because it’s thin and short duration, the fact that it’s very sizable is the most important thing,” he says.
Zalasiewicz adds that the marks of the Great Acceleration will endure, even if somehow humanity reverses global warming and gives half the planet over to conservation. And if humanity doesn’t change course, then future stratigraphers might need to elevate the Anthropocene’s rank in the geological hierarchy, he says. “An epoch would be thinking too small.”