Sometimes it seems the clouds over climate science just won't lift. Computer models of Earth's climate have multiplied in number, complexity, and computational power, yet they remain unable to answer more precisely some of the questions most on the public's mind: How high must we build sea walls to last until 2100? How bad will heat waves get in the next decade? What will Arctic shipping routes look like in 2030? Climate models all agree that global temperatures will continue to rise in response to humanity's greenhouse gas emissions, but uncertainties stubbornly persist over how quickly that will happen and how high temperatures will go.
Tapio Schneider, a German-born climate dynamicist at the California Institute of Technology (Caltech) in Pasadena, believes climate science can do better. And he's not alone. Later this summer, an academic consortium led by Schneider and backed by prominent technology philanthropists, including former Google CEO Eric Schmidt and Microsoft co-founder Paul Allen, will launch an ambitious project to create a new climate model. Taking advantage of breakthroughs in artificial intelligence (AI), satellite imaging, and high-resolution simulation, that as-yet-unnamed model—the Earth Machine is one candidate—aims to change how climate models render small-scale phenomena such as sea ice and cloud formation that have long bedeviled efforts to forecast climate. A focus will be on the major source of uncertainty in current models: the decks of stratocumulus clouds that form off coastlines and populate the trade winds. A shift in their extent by just a few percentage points could turn the global thermostat up or down by a couple of degrees or more within this century—and current models can't predict which way they will go.
Within 5 years, the team hopes its AI-fortified model will drive out that uncertainty and others by learning on its own how clouds behave, from both actual observations and purpose-built cloud simulations. It's a lofty goal, Schneider admitted late one May afternoon in sun-soaked Pasadena, sitting outside with his newly assembled team. They had just wrapped up a workshop, the third he had convened in the past year, bringing together leading climate scientists and engineers to discuss the future of their field. "We're under no illusions," Schneider said. "This is not going to be a cakewalk."
There are reasons for skepticism. The United States already has many climate models, and some people question why it needs another, further dividing resources. Others question the technology and wonder whether the philanthropists backing the project have given it the scrutiny that an agency such as the National Science Foundation would provide. The team's unorthodox message and means won't make it easy to win people over, says David Randall, a climatologist at Colorado State University in Fort Collins. "I think the existing modeling centers will push back. If Tapio is getting funding, that in principle could have gone to someone else."
Climate modelers have always followed two imperatives. First, they've folded ever more features of Earth into their simulations. Models once contained only the atmosphere and ocean; now, they have subroutines for ice sheets, land use, and the biosphere. Second, they've sought higher and higher resolutions—modeling interactions on smaller and smaller scales—riding the wave of Moore's law on government-owned supercomputers. By one estimate, the computing power those models use has increased by a factor of 100 million since the 1970s. As the models grew increasingly complex, they more fully reflected the vagaries of our planet—unknown unknowns turned to known unknowns. Yet the uncertainties remained.
At their most basic, all the models work the same way: They take the globe and chop it into a mesh, with cells some 25 kilometers to 50 kilometers on a side, and use a set of code called a dynamical core to simulate the behavior of the atmosphere and ocean over years and centuries. But much of what happens on the planet—cloud formation, for example—arises at scales smaller than those grids. Therefore, those phenomena have to be described indirectly—"parameterized" in the jargon of climate science—with rule-of-thumb equations. The modelers then adjust those various knobs to best represent the world as they know it—a process called tuning. "It's a mix of intuition and empiricism and some physically observed laws," says Isaac Held, Schneider's mentor and a scientist at the Geophysical Fluid Dynamics Laboratory, a prominent modeling center in Princeton, New Jersey.
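The idea of a parameterization can be sketched in a few lines of code. The snippet below is a toy illustration only, not drawn from any real climate model: a hypothetical sub-grid rule that estimates cloud fraction from a resolved quantity (relative humidity), with a single tunable knob of the kind modelers adjust during tuning.

```python
# Toy illustration of a sub-grid "parameterization": cloud cover inside a
# grid cell is too fine to resolve, so it is estimated from resolved
# variables (here, relative humidity) via a rule of thumb with a tunable
# knob. All names and numbers are hypothetical, for illustration only.

def cloud_fraction(relative_humidity, rh_critical=0.8):
    """Estimate the fraction of a grid cell covered by cloud.

    Below the critical humidity, no cloud forms; above it, coverage
    ramps linearly up to 1. `rh_critical` is the tuning knob.
    """
    if relative_humidity <= rh_critical:
        return 0.0
    return min(1.0, (relative_humidity - rh_critical) / (1.0 - rh_critical))

# "Tuning" means choosing knobs like rh_critical so the model's overall
# output best matches the observed world.
print(cloud_fraction(0.7))  # dry cell: no cloud
print(cloud_fraction(0.9))  # halfway between critical humidity and saturation
```

In a real model, dozens of such rules, each with its own knobs, stand in for clouds, turbulence, and other unresolved processes.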
Make no mistake: Current models do an admirable job of re-creating the world. But their shortcomings drive scientists bonkers. They struggle to re-create Arctic temperatures and melting sea ice. Their distribution of rainfall is off, biased against the extreme torrents that can cause flooding. "The rain is falling in the wrong place and at the wrong rate," says Paul O'Gorman, an atmospheric scientist at the Massachusetts Institute of Technology (MIT) in Cambridge, who formerly worked with Schneider. And, especially important, the models often fail to simulate those thick stratocumulus clouds, which typically form off the coasts of the western Americas and help cool the region.
Schneider, 46, has not always been fixated on clouds. Early in his career at Caltech, he focused on large-scale atmospheric flows, such as the Hadley cell. That atmospheric conveyor belt shifts air from the equator to the subtropics—the type of pattern that climate models can simulate using simple laws of physics. But while on an appointment at ETH Zurich in Switzerland, he became increasingly convinced that climate models could do a better job integrating new data on cloud behavior. He returned to Caltech in 2016 to seek a solution, adding a joint appointment at NASA's Jet Propulsion Laboratory (JPL) in Pasadena, where he began collaborating closely with one of JPL's cloud gurus, João Teixeira.
That was the start of what is now a collaboration of about two dozen people. AI, particularly a variant called machine learning, was on the upswing, and Schneider and Teixeira mused that it might help with the cloud problem. Soon they recruited Andrew Stuart, a soft-spoken computational mathematician at Caltech. The team found additional recruits at JPL, which has a vast archive of satellite data on clouds, and across the country at MIT, where researchers had built an ocean model infused with every possible satellite and buoy measurement of the seas.
The MIT group had ambitions to go bigger, and its members welcomed Schneider's overture. "Always the idea was to go to an Earth system model," says MIT physical oceanographer Raffaele Ferrari. "But the atmospheric community wasn't particularly primed to think the same way."
At first, the nascent collaboration was not set on creating a new climate model; the United States already has six prominent models. "It was more a question of how can we build a better model," Schneider says. But they wanted to be certain that a full climate model would incorporate their innovations. They decided the best way would be to build a new model, albeit one starting with existing code. Doing so meant they needed a computational whiz who could take their equations and make them run on a next-generation supercomputer.
A U.S. Navy expert reported for duty. Frank Giraldo, an applied mathematician at the Naval Postgraduate School in Monterey, California, is behind the Navy's new dynamical core, the mathematical engine at the heart of its next-generation weather and climate models. His core, the Non-hydrostatic Unified Model of the Atmosphere, is designed from the ground up for modern parallel computing. The core is also flexible and self-contained. It can solve equations to various degrees of accuracy in the same model, which should allow the Earth Machine to give a low-resolution overview of the planet while zooming in on clouds in real time.
Crucial input to the new model will come from simulations that have recently painted a much sharper picture of the low clouds and how they behave. Called large eddy simulations (LESs), those models trade the global scale and centuries-long time horizons of a climate model for narrow scope and high resolution. The models re-create several days in the life of small parcels of the atmosphere, with cells only 10 meters on a side. At such resolutions, key aspects of cloud formation—such as the convection that lofts sun-heated air upward until the water vapor it carries condenses into clouds—arise directly from physical laws. The results sometimes closely resemble reality, says Chris Bretherton, a leading cloud scientist at the University of Washington in Seattle.
Several years back, Bretherton led a project that used LESs to study how a 2°C temperature rise affected low ocean clouds. Two feedbacks emerged, both of which would exacerbate warming. First, higher temperatures appeared to allow more dry air to penetrate thin clouds from above, preventing them from thickening and reflecting more of the sun's energy. Second, increased carbon dioxide (CO2) levels trap heat near the clouds' tops, preventing their cooling. Because such cooling drives the turbulence that forms clouds, the effect could impede cloud formation, fueling further warming. If CO2 emissions continue unabated, Bretherton says, "It's possible that most of our low clouds in the tropics would melt away."
Other evidence, including actual cloud observations, also suggests "that the low-cloud feedback is positive and that low clouds will amplify climate warming," says Stephen Klein, an atmospheric scientist at Lawrence Livermore National Laboratory in California.
Those breakthroughs have not yet made their way into global models because no bridge, or technical way to get them there, has emerged. But Schneider's team is building one: an LES that can simulate cloud behavior over days within a domain of up to 100 kilometers on a side—about the size of one cell in a climate model. Their LES is based on a Caltech-developed model called the Python Cloud Large Eddy Simulation (PyCLES) that focuses on low clouds. "These simulations may not be perfect," Schneider says, "but they're much, much better than anything else we have." If all goes according to plan, Giraldo's code will run 1000 PyCLES-type models on the fly as individual cells inside the Earth Machine. The machine will also use AI to study the observed and simulated clouds, extrapolating what it learns to improve the rules of thumb it uses to simulate clouds across the globe. Soon, virtual cloud decks will sprout off the California coast.
Incorporating AI into climate modeling is a work in progress. Several researchers, including Bretherton's group and Michael Pritchard, a climate modeler at the University of California, Irvine, trained one form of AI, neural networks, on high-resolution simulations of the atmosphere. They then used the AI to replace several classic rules of thumb, such as how quickly the temperature and humidity change in rising air. "All of these are in the feeling-around type of phase," Bretherton says.
But neural networks and climate are an uneasy fit. The algorithms do best on problems such as classification—for example, learning from millions of labeled photos what a dog looks like. The code builds up an intricate internal model of what an object looks like, one that is often wholly opaque to human reasoning. The approach works for dogs—but may break down when it encounters something outside its training data—say, a camel. And for climate change, the future is a camel. For that reason, Stuart and Schneider are not banking on neural networks to guide the Earth Machine's AI. Instead, they seek a compromise, something between traditional rules of thumb and pure AI. They hope to develop code that can use hard-won knowledge of clouds and then fill in gaps with its learned intuition, essentially replacing the manual tuning typically done by modelers with a machine.
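The camel problem can be seen in miniature with any purely data-driven fit, not just a neural network. The sketch below (a toy example, unrelated to any climate code) fits a straight line to samples of a curved function taken over a narrow training range; the fit interpolates tolerably but fails badly when asked about conditions it has never seen.

```python
# Toy illustration of extrapolation failure in a data-driven fit. We fit a
# straight line to samples of y = x**2 on the "training range" [0, 1],
# where the curve is gentle, then query it at x = 3, far outside that
# range. The in-range error is small; the out-of-range error is large.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b; returns the fitted line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return lambda x: a * x + b

truth = lambda x: x ** 2
xs = [i / 10 for i in range(11)]            # training range: 0.0 .. 1.0
model = fit_line(xs, [truth(x) for x in xs])

print(abs(model(0.5) - truth(0.5)))         # in-range error: small (0.1)
print(abs(model(3.0) - truth(3.0)))         # out-of-range error: large (6.15)
```

A model that only memorizes past climate statistics faces the same trap: a warmer world is outside its training range.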
That learning won't be driven by individual cloud systems, which are imprinted with the atmospheric chaos that begat them. Rather, the AI will learn from seasonal or annual statistics on cloud coverage and other factors, wiping out the noise of weather. As Stuart and Schneider move each rule of thumb over to the AI's hands, they'll wire the model to calculate probabilities, allowing an overall reckoning of uncertainty not yet seen in current climate models. And, partly at the prompting of their engineering-minded funders, they'll develop metrics to gauge how accurately the model renders the world. They're betting on recent insights that, for some aspects of the climate system, short-term accuracy in a model indicates decades-long viability.
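The distinction between learning from noisy individual states and learning from their statistics can be sketched in code. The example below is a hypothetical stand-in, not the project's algorithm: daily "weather" is a knob-controlled mean plus random noise, so any single day is uninformative, but a long seasonal average pins the knob down well.

```python
import random

# Toy sketch of calibrating a model knob against a long-run statistic
# rather than individual noisy states. Daily "cloud cover" here is a
# knob-controlled mean plus weather noise; chaotic daily values are
# useless for fitting, but the seasonal mean is stable. All names and
# numbers are hypothetical, for illustration only.

random.seed(0)
DAYS = 10_000

def simulate_season(knob):
    """Daily cloud cover: the knob sets the mean; noise plays weather."""
    return [knob * 0.5 + random.gauss(0.0, 0.2) for _ in range(DAYS)]

observed_mean = 0.30  # e.g., a satellite-derived seasonal cloud-cover average

# Calibrate by matching the statistic: pick the knob whose simulated
# seasonal mean lands closest to the observed one.
candidates = [i / 100 for i in range(0, 101)]
best = min(candidates,
           key=lambda k: abs(sum(simulate_season(k)) / DAYS - observed_mean))
print(best)  # close to 0.6, since 0.6 * 0.5 = 0.30
```

Averaging over a season shrinks the weather noise by a factor of roughly the square root of the number of days, which is what makes the statistic, unlike any single day, a usable calibration target.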
Success is far from guaranteed, the team agreed after the May workshop. "It could be that what we do ends up not improving the numbers, just to be completely scientifically honest," Stuart said. But even so, he added, the approach should spark new ideas across climate modeling. "I'd say that's the worst-case scenario," Schneider quickly interjected. "This is why I say we can't fail entirely. But I do hope we will do more than that."
As Schneider assembled his team and developed a general plan, he still faced a big question: Who would support their dream? That the U.S. government would finance yet another climate model seemed unlikely. Even before President Donald Trump's White House proposed cuts in climate science, former President Barack Obama's administration had explored whether the country needed to support so many models.
Fortunately for the Caltech team, the tech philanthropists—particularly Allen, who has already invested heavily in oceanography—were looking for something to make a splash. They sought a risky investment with a big potential payoff that could make climate forecasting more concrete. Schneider already had preliminary support for the Earth Machine from Charles Trimble, a Caltech alumnus who miniaturized the GPS receiver, and the Heising-Simons Foundation in Los Altos, California. But to reach their full ambitions, they needed more, some $5 million annually—a goal that now seems in sight, though the exact financing was still being finalized at press time.
The mix of ambition, metrics, and innovation embodied in the Earth Machine was exactly the type of work that Allen wants to fund, says Chris Emura, a Seattle, Washington-based computer engineer leading Allen's engagement with Schneider. Over the past year, Allen's team has been enmeshed in the modeling world, visiting leading centers to gauge what they can and can't do. Schneider's project, Emura says, has some audacity to it, with a high degree of "responsible risk." The team also has garnered interest from the Windward Fund, a conservation charity in Washington, D.C., that began an effort last month to support work that improves near-term practical climate predictions. And, this week, Schneider's endeavor confirmed backing from Schmidt Futures in New York City, the science-focused philanthropy of former Google CEO Eric Schmidt and Wendy Schmidt, president of the Schmidt Family Foundation in Palo Alto, California. "It's an attractive blend of conservative and bold approaches," says Stuart Feldman, the philanthropy's chief scientist.
As rumors of the Earth Machine have spread, the project has drawn a mix of support, envy, and skepticism. A new approach like that is desperately needed, says Trude Storelvmo, an atmospheric scientist at the University of Oslo. "This is a very welcome and innovative idea." She adds that it could bolster the case for expanded observations of clouds—necessary because NASA's current cloud satellites have worked nearly a decade longer than planned.
In contrast, Amy Clement, a cloud scientist at the University of Miami in Florida, laments the focus on building more complex models. "As a result, in my opinion, we are losing a lot of our ability to gain fundamental understanding of the climate system." However, she adds, given Schneider's acumen as a climate scientist, the model might lead to such understanding. Bretherton, meanwhile, likes the group's ambitions but questions whether a new model is needed to realize them. "We already have too many climate models in the United States," he says. "It divides our resources and makes scientific progress slower."
Other people think the project is discounting rewards that will come when existing models are pushed to run globally at higher resolutions. Much of the climate science community in Europe, for example, is invested in a proposal called Extreme Earth, which would push models to a resolution of 1 kilometer per cell. Although such code would require a network of supercomputers and wouldn't run as long as traditional models, it would also eliminate many parameters that Schneider is seeking to improve with AI, replacing them with physics. "I'm so frustrated with the idea of parameterizing these things," says Bjorn Stevens, a climate scientist at the Max Planck Institute for Meteorology in Hamburg, Germany. "What I find more exciting is getting rid of those rules of thumb."
There's also a big assumption baked into the Earth Machine: that the cloud problem can even be solved, adds Joel Norris, a cloud scientist at the Scripps Institution of Oceanography in San Diego, California. Perhaps any sort of parameterization, even one tuned by AI, cannot crack clouds to a meaningful degree. "It may be the case you can't reduce the uncertainty," Norris says. Some satellite observations essential to rendering clouds, such as the exact location of water vapor in the lower atmosphere, simply don't exist. And Schneider's team could be shocked when it sees how apparently unconnected parts of the model go awry when clouds are tweaked, Held adds. "There's just a lot of connections."
Schneider's team is aware of all those concerns and shares many of them. But the members are ambitious and have grown impatient waiting for a breakthrough. They've lived with human-driven climate change, and its dogged uncertainties, as a reality for their entire adult lives. It's time for the clouds to lift.