
Emulators speed up simulations, such as this NASA aerosol model that shows soot from fires in Australia. (Credit: NASA)

From models of galaxies to atoms, simple AI shortcuts speed up simulations by billions of times

Modeling immensely complex natural phenomena such as how subatomic particles interact or how atmospheric haze affects climate can take many hours on even the fastest supercomputers. Emulators, algorithms that quickly approximate these detailed simulations, offer a shortcut. Now, work posted online shows how artificial intelligence (AI) can easily produce accurate emulators that can accelerate simulations across all of science by billions of times.

“This is a big deal,” says Donald Lucas, who runs climate simulations at Lawrence Livermore National Laboratory and was not involved in the work. He says the new system automatically creates emulators that work better and faster than those his team designs and trains, usually by hand. The new emulators could be used to improve the models they mimic and help scientists make the best of their time at experimental facilities. If the work stands up to peer review, Lucas says, “It would change things in a big way.”

A typical computer simulation might calculate, at each time step, how physical forces affect atoms, clouds, galaxies—whatever is being modeled. Emulators, based on a form of AI called machine learning, skip the laborious reproduction of nature. Fed with the inputs and outputs of the full simulation, emulators look for patterns and learn to guess what the simulation would do with new inputs. But creating training data for them requires running the full simulation many times—the very thing the emulator is meant to avoid. 
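The workflow can be illustrated with a deliberately tiny sketch. Everything here is a toy stand-in: the “simulation” is just a cheap analytic function, and a polynomial fit plays the role of the machine learning model used in practice. The pattern is the point: run the full model a limited number of times, fit a surrogate to the input-output pairs, then query the surrogate instead.

```python
import numpy as np

def expensive_simulation(x):
    # Toy stand-in for a costly physics code; imagine hours per call.
    return np.sin(x)

# Step 1: run the full simulation a limited number of times
# to generate training data.
rng = np.random.default_rng(0)
x_train = rng.uniform(-1.0, 1.0, 50)
y_train = expensive_simulation(x_train)

# Step 2: fit a cheap surrogate to the input-output pairs
# (a least-squares polynomial here; a neural network in practice).
coeffs = np.polyfit(x_train, y_train, deg=6)

# Step 3: the emulator answers new queries without rerunning the simulation.
x_new = np.linspace(-1.0, 1.0, 201)
y_emulated = np.polyval(coeffs, x_new)
max_err = float(np.max(np.abs(y_emulated - expensive_simulation(x_new))))
```

On this toy problem the surrogate matches the “simulation” closely while costing almost nothing to evaluate; real emulators face the same trade-off at vastly larger scale, which is why generating enough training runs is the bottleneck.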

The new emulators are based on neural networks—machine learning systems inspired by the brain’s wiring—and need far less training. Neural networks consist of simple computing elements linked into circuitry tailored to particular tasks. Normally, only the connection strengths evolve through training. A technique called neural architecture search, however, can also identify the most data-efficient wiring pattern for a given task.

The technique, called Deep Emulator Network Search (DENSE), relies on a general neural architecture search co-developed by Melody Guan, a computer scientist at Stanford University. It randomly inserts layers of computation between the networks’ input and output, and tests and trains the resulting wiring with the limited data. If an added layer enhances performance, it’s more likely to be included in future variations. Repeating the process improves the emulator. Guan says it’s “exciting” to see her work used “toward scientific discovery.” Muhammad Kasim, a physicist at the University of Oxford who led the study, which was posted on the preprint server arXiv in January, says his team built on Guan’s work because it balanced accuracy and efficiency.
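In spirit, the search resembles the loop sketched below, though this is a drastic simplification under toy assumptions: the only architectural choice explored here is the width of a one-hidden-layer network, the target function and all names are invented for illustration, and DENSE’s actual search over inserted layers is far more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulation(x):
    # Toy target the emulator must learn from scarce data.
    return np.sin(x)

x_train = rng.uniform(-1.0, 1.0, (64, 1)); y_train = simulation(x_train)
x_val = rng.uniform(-1.0, 1.0, (64, 1)); y_val = simulation(x_val)

def train_and_score(hidden, steps=3000, lr=0.05):
    """Train a one-hidden-layer tanh network by gradient descent;
    return its mean squared error on held-out data."""
    W1 = rng.normal(0.0, 1.0, (1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 1.0 / np.sqrt(hidden), (hidden, 1)); b2 = np.zeros(1)
    n = len(x_train)
    for _ in range(steps):
        h = np.tanh(x_train @ W1 + b1)
        err = h @ W2 + b2 - y_train
        gW2 = h.T @ err / n; gb2 = err.mean(0)
        gh = (err @ W2.T) * (1.0 - h ** 2)   # backprop through tanh
        gW1 = x_train.T @ gh / n; gb1 = gh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    h = np.tanh(x_val @ W1 + b1)
    return float(np.mean((h @ W2 + b2 - y_val) ** 2))

# Architecture search, reduced to its essence: propose candidate
# architectures, train each on the limited data, keep the one that
# generalizes best.
best_arch, best_mse = None, float("inf")
for hidden in (2, 8, 32):
    mse = train_and_score(hidden)
    if mse < best_mse:
        best_arch, best_mse = hidden, mse
```

The key design point mirrored here is that candidates are judged on held-out data, so the search favors architectures that learn efficiently from few examples rather than ones that merely memorize the training runs.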

The researchers used DENSE to develop emulators for 10 simulations—in physics, astronomy, geology, and climate science. One simulation, for example, models the way soot and other atmospheric aerosols reflect and absorb sunlight, affecting the global climate. It can take a thousand computer-hours to run, so Duncan Watson-Parris, an atmospheric physicist at Oxford and study co-author, sometimes uses a machine learning emulator. But, he says, it’s tricky to set up, and it can’t produce high-resolution outputs, no matter how much data you give it.

The emulators that DENSE created, in contrast, excelled despite the lack of data. When they were turbocharged with specialized graphics processing chips, they were between about 100,000 and 2 billion times faster than the simulations they replaced. That speedup isn’t unusual for an emulator, but these were highly accurate: In one comparison, an astronomy emulator’s results were more than 99.9% identical to the results of the full simulation, and across the 10 simulations the neural network emulators were far better than conventional ones. Kasim says he thought DENSE would need tens of thousands of training examples per simulation to achieve these levels of accuracy. In most cases, it used a few thousand, and in the aerosol case only a few dozen.

“It’s a really cool result,” says Laurence Perreault-Levasseur, an astrophysicist at the University of Montreal who simulates galaxies whose light has been lensed by the gravity of other galaxies. “It’s very impressive that this same methodology can be applied for these different problems, and that they can manage to train it with so few examples.”

Lucas says the DENSE emulators, on top of being fast and accurate, have another powerful application. They can solve “inverse problems”—using the emulator to identify the best model parameters for correctly predicting outputs. These parameters could then be used to improve full simulations.
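A sketch of the inverse-problem idea, again under toy assumptions (the forward model, the emulator, and every name here are invented for illustration): because the emulator is nearly free to evaluate, one can sweep the parameter space densely and pick the parameter whose emulated output best matches an observation.

```python
import numpy as np

def simulation(theta):
    # Toy forward model: maps a parameter to an observable quantity.
    return np.tanh(2.0 * theta) + 0.3 * theta

# Build a cheap emulator from a limited set of simulation runs.
rng = np.random.default_rng(2)
theta_train = rng.uniform(-1.0, 1.0, 200)
coeffs = np.polyfit(theta_train, simulation(theta_train), deg=9)

def emulator(theta):
    return np.polyval(coeffs, theta)

# Inverse problem: which parameter best explains this observation?
true_theta = 0.4
observed = simulation(true_theta)

# A dense sweep is affordable only because the emulator, not the
# full simulation, is evaluated at every candidate.
candidates = np.linspace(-1.0, 1.0, 10001)
recovered = candidates[np.argmin(np.abs(emulator(candidates) - observed))]
```

Sweeping 10,001 candidates through the full simulation would be hopeless if each run took hours; through the emulator it takes a fraction of a second, and the recovered parameter can then seed or refine the full model.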

Kasim says DENSE could even enable researchers to interpret data on the fly. His team studies the behavior of plasma pushed to extreme conditions by a giant x-ray laser at Stanford, where time is precious. Analyzing their data in real time—modeling, for instance, a plasma’s temperature and density—is impossible, because the needed simulations can take days to run, longer than the time the researchers have on the laser. But a DENSE emulator could interpret the data fast enough to modify the experiment, he says. “Hopefully in the future we can do on-the-spot analysis.”