In April 2009, an earthquake hit the town of L'Aquila in central Italy, killing 309 people. More than 3 years later, four scientists, two engineers, and a government official, all members of Italy's National Commission for the Forecast and Prevention of Major Risks at the time of the earthquake, were found guilty of involuntary manslaughter. They were convicted, the judge said, not because they had failed to predict the earthquake, but because they had failed to analyze and explain the risks adequately and had given false reassurances to the townsfolk. The seven commission members received a 6-year prison sentence that sparked protests from scientists around the world, concerned that the verdict would push researchers to keep silent about risks in the future.
Early-career scientists rarely sit on high-profile committees, but they may, nonetheless, uncover important information about threats to security or wellbeing. And they are especially vulnerable: Getting embroiled in a legal case could damage a budding career.
So what are the professional responsibilities of scientists when it comes to communicating risk? And, in the aftermath of the L’Aquila case, how should they go about meeting those responsibilities?
Liability and professional responsibility
The Italian verdict was expected to have a “chilling effect” on scientists. But according to Andrew Rosenberg, director of the Center for Science and Democracy at the Union of Concerned Scientists (UCS) in Cambridge, Massachusetts, the L'Aquila case remains an “aberration” and not a dangerous legal precedent. Provided their public statements are based on sound research methods and findings, scientists generally have no reason to worry about legal backlash. “If people adhere, not dispassionately but clearly to their results, and can explain where the information comes from, then the liability in general is quite low,” Rosenberg says.
Scientists with relevant expertise do, however, have a professional responsibility—even a duty—to communicate to the public the best possible evidence about risk, says Andrew Maynard, director of the University of Michigan Risk Science Center. “From an ethics perspective, we have the responsibility to empower people to make the best possible decision,” Maynard says. He insists, however, that scientists should be careful not to overstep the fine line between helping people make decisions and telling them what to do. “Scientists need to partner ... with others in decision making, rather than try and dominate the conversation.”
At UCS, Rosenberg advocates a somewhat more assertive posture. When he worked as a lead regulator in the Northeast, his studies of New England and mid-Atlantic fisheries inspired him to raise an alarm about overexploitation of the resource. “We spent a lot of time in meetings getting yelled at, but it turned out the advice was correct,” he says. Measures taken then to curb overfishing helped some of the fish stocks recover, Rosenberg says. “Communicating risk is not the same as communicating uncertainty,” he argues. “It's important to know [and convey] where the weight of evidence is.” Dwelling too much on uncertainty can be paralyzing, he contends.
While researchers may play their part by communicating their own results, David Spiegelhalter, the Winton Professor for the Public Understanding of Risk in the statistical laboratory at the University of Cambridge in the United Kingdom, says scientists can also police what others say about risk. “The media want to make things exciting and usually alarming, so there's a tendency to present figures in a way that makes them look dramatic, and we should be able to take these stories apart,” he says. Although not every scientist will be willing to take on that public role, “it's important that it's not just left to the very senior people to draw attention to the misuse of evidence or statistics.” Whether or not their work is specifically on risk assessment, scientists at any career stage usually understand statistics—if they don’t, they should learn—and know how scientific evidence is used to draw conclusions, he adds.
Online tools provide cheap, easy opportunities to offer commentary on risk. Spiegelhalter uses his personal blog and Twitter account. Maynard produces short YouTube videos where he discusses the risks of electronic cigarettes or HPV vaccines, among other topics.
One of the most important aspects of communicating risk is appreciating its extent and seriousness, and striking the right balance between informing and alarming.
While scientists and risk professionals most often take a rational approach to deciding when a risk is big enough to speak up, they need to understand that the public (or scientists outside their field of expertise) may perceive and rank risks differently. For example, how a risk is spread over time and space will greatly change how people perceive it, writes Peter Sandman, a U.S. risk consultant based in Brooklyn, New York, on his website. “Hazard A kills 50 anonymous people a year across the country. Hazard B has one chance in 10 of wiping out its neighborhood of 5,000 people sometime in the next decade. Risk assessment tells us the two have the same expected annual mortality: 50. 'Outrage assessment' tells us A is probably acceptable and B is certainly not.” Before they talk about risks, then, scientists must anticipate the likely response and pitch their message appropriately. (In a recent article published on Science Careers’ sister site, ScienceInsider, Sandman and his wife and colleague Jody Lanard offered advice to scientists, government officials, and journalists on what messages they should stress when talking about the spread of the Ebola virus.)
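Sandman's "same expected annual mortality" claim is a one-line calculation; a quick sketch of the arithmetic behind his two hazards:

```python
# Hazard A: kills 50 anonymous people a year across the country.
hazard_a_annual_deaths = 50

# Hazard B: a 1-in-10 chance of wiping out a neighborhood of 5,000 people
# sometime in the next decade. Expected annual mortality spreads that
# probability-weighted loss over the 10-year window.
probability = 0.10
population = 5_000
years = 10
hazard_b_annual_deaths = probability * population / years

print(hazard_a_annual_deaths)  # 50
print(hazard_b_annual_deaths)  # 50.0
```

Both hazards work out to 50 expected deaths per year, which is Sandman's point: identical risk-assessment numbers can carry very different "outrage" values.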
In addition, absolute and relative risk must be deployed carefully, depending on the situation, Spiegelhalter says. Absolute risk expresses the likelihood of something happening as a percentage or ratio, while relative risk compares risk levels between groups. Using either can make a risk look more or less important, depending on the context, Spiegelhalter says. In L'Aquila, the absolute risk of a violent earthquake was low, but it was higher than usual, and the potential damage was high, so using an absolute figure could appear to minimize the risk. "On the other hand, some people will make a fuss about a particular diet increasing your risk of a particular disease by 20%" compared to other diets, even though the risk may be low over a person's lifetime, Spiegelhalter says. In that case, referring to absolute risk could be helpful to provide perspective.
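Spiegelhalter's diet example can be made concrete. The baseline lifetime risk below is an illustrative assumption, not a figure from the article:

```python
# Assumed baseline lifetime risk of the disease for people on the usual
# diet: 1% (hypothetical figure, chosen only to illustrate the point).
baseline_absolute_risk = 0.01

# A diet that "increases your risk by 20%" is a relative risk of 1.20.
relative_risk = 1.20
new_absolute_risk = baseline_absolute_risk * relative_risk

# The alarming-sounding 20% relative increase moves the absolute risk
# only from 1.0% to 1.2% -- an extra 0.2 percentage points.
absolute_increase = new_absolute_risk - baseline_absolute_risk
print(f"{absolute_increase:.3%}")  # 0.200%
```

The same change can be framed as "20% higher risk" or "2 extra cases per 1,000 people"; quoting the absolute figure is what provides the perspective Spiegelhalter describes.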
As a successful example of risk communication, Spiegelhalter mentions leaflets published by the U.K. National Health Service about breast cancer screening, which he helped write and design. In the United Kingdom, all women aged 50 to 70 are invited for breast screening every 3 years; the leaflet helps them decide whether to take the test. The issue divides the medical community, Spiegelhalter says, but the publication has been well received because it spells out risks and benefits clearly and uses real-life figures that the public can easily grasp. “Screening saves about 1 life from breast cancer for every 200 women who are screened,” according to the leaflet. On the other hand, “for every 1 woman who has her life saved from breast cancer, about 3 women are diagnosed with a cancer that would never have become life-threatening” and are offered treatment they don't need.
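The leaflet's "real-life figures" scale cleanly to a cohort, which is part of why they are easy to grasp. A minimal sketch using the leaflet's two numbers (the cohort size of 10,000 is an arbitrary illustrative choice):

```python
# Figures quoted from the NHS breast screening leaflet.
women_screened_per_life_saved = 200
overdiagnoses_per_life_saved = 3

# Scale to an illustrative cohort of 10,000 screened women.
cohort = 10_000
lives_saved = cohort / women_screened_per_life_saved       # 50 lives saved
overdiagnosed = lives_saved * overdiagnoses_per_life_saved  # 150 overdiagnosed

print(lives_saved, overdiagnosed)  # 50.0 150.0
```

Presenting both outcomes side by side, in natural frequencies rather than percentages, is what lets readers weigh benefit against harm for themselves.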
Before they get engaged with risk communication, scientists should understand “whatever rules, policies, and laws are in place” where they live and work, says Jeff Rubin, former chair of the Geological Society of America's Geology and Public Policy Committee and emergency manager of the Tualatin Valley Fire and Rescue in Tigard, Oregon. This includes their own institution’s procedures: “Talk to a supervisor or someone responsible for public relations [and] get advice. … There may be policies in place” to deal with this situation, he advises.
Still, “you have to be careful if you start putting your head up,” Spiegelhalter says. He suggests that young scientists first watch what senior, high-profile scientists do. Scientists sometimes face personal attacks when they speak out about risk, Rosenberg admits, especially if their findings threaten economic interests. Researchers who raised the alarm about the risks of exposure to asbestos in the 1960s faced backlash from asbestos manufacturers, for example. Rosenberg advises that early-career scientists seek feedback and mentoring from other scientists. One way to do this is to raise issues with other researchers through online networks like ResearchGate. UCS has also published an online guide to help researchers deal with harassment.
While getting training in statistics or communication from your university or scientific society can provide you with tools to better communicate risk, your best safety net is probably your expertise. Above all, your research and knowledge should be rock-solid, Spiegelhalter says: “You don't want to harm your career. You have to know what you're talking about.”