Heat from radioactivity, not starlight, could warm planets enough to allow liquid water to exist on their surfaces.

No star, no problem: Radioactivity could make otherwise frozen planets habitable

Not too close, but not too far. That’s long been the rule describing how distant a planet should be from its star in order to sustain life. But a new study challenges that adage: A planet can maintain water and other liquids on its surface if it’s heated, not by starlight, but by radioactive decay, researchers calculate. That opens up the possibility for many planets—even free-floating worlds untethered to stars—to host life, they speculate.

Radioactive isotopes such as uranium-238, thorium-232, and potassium-40 pepper Earth’s crust and mantle. As these unstable radionuclides decay, they generate a small amount of power—roughly one-thirty-thousandth of the power Earth receives from the Sun. But researchers have now proposed that some planets, particularly ones that form near the center of our Milky Way Galaxy, might possess enough of these radioactive isotopes to generate sufficient heat to keep their surfaces from freezing entirely solid.
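The heat a radionuclide releases follows directly from its half-life and decay energy. A minimal sketch of that arithmetic for uranium-238 (using standard nuclear data, not figures from the study; the helper name is illustrative):

```python
import math

N_A = 6.022e23          # Avogadro's number, atoms per mol
MEV_TO_J = 1.602e-13    # joules per MeV
SECONDS_PER_YEAR = 3.156e7

def decay_heat_per_kg(half_life_yr, atomic_mass, q_mev):
    """Heat released per kilogram of a pure radionuclide, in W/kg.
    Assumes the full decay-chain energy is deposited locally and
    ignores energy carried off by neutrinos, so values run slightly high."""
    lam = math.log(2) / (half_life_yr * SECONDS_PER_YEAR)  # decay constant, 1/s
    atoms_per_kg = N_A * 1000 / atomic_mass                # atoms in 1 kg
    return lam * atoms_per_kg * q_mev * MEV_TO_J           # decays/s * J/decay

# Uranium-238: ~4.47-billion-year half-life, ~51.7 MeV released
# over its entire decay chain (in secular equilibrium)
p_u238 = decay_heat_per_kg(4.468e9, 238, 51.7)  # ~1e-4 W per kg
```

Per kilogram the power is tiny, on the order of a tenth of a milliwatt, which is why an Earth-like inventory of these isotopes falls so far short of sunlight.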

“That gives you the freedom to be anywhere,” says Avi Loeb, an astrophysicist at Harvard University and a co-author of the new study. “You don’t need to be close to a star.”

Loeb and Manasvi Lingam, an astrobiologist at the Florida Institute of Technology, looked at three sources of heat for a sunless planet: heat leftover from its formation, the radioactive decay of long-lived isotopes over billions of years, and the radioactive decay of short-lived isotopes over hundreds of thousands of years. They then modeled the surface temperatures of planets with different masses and radionuclide abundances to determine whether water, ammonia, and ethane—three solvents found in the Solar System—could exist as liquids.
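The essence of such a model is an energy balance: the planet’s radiogenic heat, spread over its surface, sets a blackbody temperature. A simplified sketch under stated assumptions (Earth’s present radiogenic output of roughly 20 terawatts as the baseline, no atmosphere, no leftover formation heat; the function and scaling exponent are illustrative, not the paper’s model):

```python
import math

SIGMA = 5.670e-8              # Stefan-Boltzmann constant, W m^-2 K^-4
Q_EARTH_RADIOGENIC = 2.0e13   # Earth's present radiogenic heat, ~20 TW (assumed baseline)
R_EARTH = 6.371e6             # Earth's radius, m

def surface_temperature(abundance_factor, mass_factor=1.0):
    """Blackbody surface temperature (K) of a starless planet heated
    only by radioactive decay. Heat scales with mass (more rock, more
    radionuclides) times the radionuclide abundance relative to Earth;
    radius scales roughly as mass**0.27 for rocky planets."""
    heat = Q_EARTH_RADIOGENIC * abundance_factor * mass_factor
    radius = R_EARTH * mass_factor ** 0.27
    flux = heat / (4 * math.pi * radius ** 2)  # outgoing flux, W/m^2
    return (flux / SIGMA) ** 0.25              # equilibrium temperature, K

t_earth_like = surface_temperature(1.0)      # Earth's abundance: deeply frozen
t_enriched = surface_temperature(1000.0)     # 1000x abundance: much warmer
```

Because temperature grows only as the fourth root of heating, a thousandfold jump in radionuclide abundance raises the surface temperature by a factor of about 5.6, which is why such extreme enrichments are needed.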

Warming a planet enough to liquify water requires roughly 1000 times Earth’s abundance of both long- and short-lived radioactive isotopes, Lingam and Loeb report in The Astrophysical Journal Letters. Planets with the same mass as Earth but with about 100 times the abundance of radionuclides would pump out enough heat to keep ethane liquid over hundreds of millions of years, they found. The radiation levels on such worlds would be hundreds of times higher than the time-averaged doses Chernobyl residents experienced after the Ukrainian nuclear disaster in 1986, Lingam and Loeb estimated.

It’s unlikely that multicellular life would survive such irradiation, Lingam says. But some of Earth’s most extreme microbes would have better than a fighting chance. For instance, Deinococcus radiodurans, a highly radiation-resistant bacterium, would do just fine, Lingam says. “Deinococcus radiodurans is a really crazy organism.”

Could a single planet amass such a large stockpile of radionuclides? That’s the key question, Loeb says. Such worlds, if they existed in our own Galaxy, would probably have to be born near the center of the Milky Way. That’s because heavy elements such as uranium and thorium are thought to be produced in collisions between neutron stars, and such collisions are more likely to occur in the densely crowded center of the Galaxy.

But finding such a planet would come as a surprise because it’s so unlike the other worlds in our Solar System, says Tim Lichtenberg, a planetary scientist at the University of Oxford who was not involved in the research. “It’s hard to argue that it’s impossible. But it’s definitely not the norm.”

If one of these worlds does exist, the James Webb Space Telescope, slated to launch in 2021, might be able to spot it by virtue of the radiation it would emit. But one of the telescope’s cameras would need roughly 10 days to detect the signal, which would be strongest in the infrared, Lingam and Loeb calculated. And that exposure estimate could vary wildly depending on the planet’s age, radionuclide abundance, and mass. “There are so many unknowns,” Lingam says. “We haven’t said the last word.”