On 1 July, the University of Hawaii (UH), Manoa, presented the results of an exhaustive independent investigation into the 16 March lab explosion in which postdoc Thea Ekins-Coward lost her right arm. Though delayed several times, the two-part report from the University of California Center for Laboratory Safety (UCCLS) is well worth the wait. Based on extensive forensic testing, its analysis of why the tank of pressurized oxygen, hydrogen, and carbon dioxide blew up is thorough, precise, and illuminating. Its recommendations for how UH and other universities can prevent similar calamities in the future are thoughtful, detailed, and explicit.
But, in a larger sense, getting the investigators’ final word was an anticlimax. The fact is that, months before I read the report, I already pretty much knew what it would say. It’s not that I have any technical expertise in gas cylinders—far from it! And I don’t mean that I knew the precise details about why this particular detonation occurred. It was news to me that sparking by an inappropriate pressure gauge, which Honolulu Fire Department (HFD) investigators suggested as the cause of the explosion in April, was not in fact to blame. Rather, the explosion occurred because of a transfer of static electricity to the tank, which lacked the “critical” safeguard of grounding, the UCCLS report explains.
It’s just that I’ve read enough reports on academic lab disasters to know that, when it comes to fundamentals, they’re all pretty much the same. I had encountered the basic reason for this disaster time after time before: in the reports on the fire that caused the 2009 death of University of California, Los Angeles (UCLA), laboratory assistant Sheharbano “Sheri” Sangji; on the 2010 explosion at Texas Tech University that maimed graduate student Preston Brown; and on the fatal 2011 strangulation of Yale undergraduate Michele Dufault, not to mention several other incidents.
Like all those earlier disasters, the one in Hawaii was totally predictable and preventable, this most recent report shows. Like them, it happened because people with power and authority over the laboratory failed to give primacy to lab workers’ lives and safety.
In each case, disregard of this basic value translated into failures to enable and require researchers to evaluate the hazards inherent in their experiments, to assess and mitigate the resulting risks, to adhere to recognized safety practices, and to use all necessary protective equipment. Beyond that, both the researchers and institutions failed to heed and learn from clear warnings of danger.
“[T]he overall underlying cause,” the UCCLS report’s recommendations section states, “was failure to recognize and control the hazards of an explosive gas mixture. … The safety program at UH was not designed to assist researchers in identifying hazards, making risk assessments, and controlling laboratory hazards.” Furthermore, when a “near miss event”—in this case, a small explosion in a tank—“occurred just prior” to the disaster, “none of the researchers related this … event to the similar hazards posed by other ongoing experiments involving even larger quantities of the gas mixture.”
The U.S. Chemical Safety Board’s (CSB’s) report on the Texas Tech explosion also notes “systemic deficiencies … that contributed to the incident.” The fact that experimental risks “were not effectively assessed, planned for, or mitigated” was a contributing factor. The university’s system of “safety management accountability and oversight” was inadequate, and no one took note of or learned from earlier incidents that provided “preventative lessons” about impending danger. Without “formal hazard evaluation and risk assessment,” there was no “plan for the worst-case scenario,” the report notes.
The California Division of Occupational Safety and Health’s investigative report on the UCLA fire identified similar deficiencies. The university lacked “adequate lab safety training and documentation [and] effective hazard communication” and “repeated[ly] fail[ed] to correct persistent and repeated safety violations.” The university’s Office of Environment, Health and Safety (EH&S) knew about “continuous and pervasive safety violations” but did not force them to be corrected. “[E]ven after … two incidents that resulted in significant burn injuries to employees, … the EH&S Department failed to take any affirmative steps to abate a rather clear and appreciable danger.”
No full-scale report was issued regarding the death at Yale because the victim, as a student, did not come under occupational safety laws. A letter to the university from the U.S. Occupational Safety and Health Administration, which Yale disputed, nonetheless pointed to a lack of training and of precautions to mitigate the risks of a dangerous lathe. Nor was there any report that we know of about the hydrogen tank explosion that killed postdoc Meng Xiangjian at Tsinghua University in Beijing in December. Still, as a scientist at the Institute of Chemistry at the Chinese Academy of Sciences told China Daily, “[c]ertain hazards will be found in individual experiments, but, in general, fire and explosions are preventable if all necessary steps have been taken”—the implication being that, in this instance, they were not.
The same bottom line
I won’t try to summarize the bountiful and extremely valuable information in the UCCLS report about the UH explosion. The many pages of technical analysis of the tank, its contents, and use—along with specific recommendations for its safe handling—should greatly interest anyone whose research involves such materials. Beyond that, there’s a penetrating and insightful critique of UH’s deficient safety regime, plus detailed suggestions for attacking its cultural and systemic problems.
But it’s not as if UH was actively trying to be unsafe. At the news conference held the day after the explosion, university officials appeared quite sincere in claiming that UH strove to maintain a good safety program and in explaining that the lab had met all requirements in its annual inspection. But, as the report makes piercingly clear, that is not enough. “Most importantly,” it says, “an effective laboratory safety program must be integrated into the research process rather than being an annual housekeeping exercise conducted days before an anticipated annual laboratory inspection” (italics added). Beyond that, such a program “needs to be thorough, consistent and sustained within the research institution. Firm guidance and support must be provided by campus leadership. It must be embraced at every level of the institution from the Chancellor down to beginning students or newly hired staff.”
The truth is that UH probably isn’t any worse than a lot of other universities. In much of academe, the report notes, “[s]olving technical challenges in experiments are seen as a higher priority than considering the risks of the process,” and “[f]ormal risk assessments are typically not integrated into planning and conducting experimental procedures” or “done when changing experimental protocols involving highly hazardous chemicals or processes.” Ekins-Coward was reportedly using a modified procedure for the first time when the explosion occurred, as we noted when the HFD issued its conclusions.
And the UCCLS report makes yet another crucial point: Neither of the organizations funding the research she was working on—Bio-on, a European company that makes materials from renewable sources, and the U.S. Office of Naval Research—“requested a risk analysis for this work with explosive materials,” the report notes. “[S]ome federal agencies do have specific requirements on safety,” it continues, “but it is not a general practice to require proposals to contain a) identification of hazards in the proposed research, b) strategies [for] how those hazards will be mitigated, and c) information regarding how lab workers will be trained on the project hazards.”
One agency that does demand such information is the Department of Homeland Security (DHS), which sponsored Preston Brown’s research at Texas Tech; it instituted these requirements in the aftermath of the explosion there, as the CSB report notes. “To ensure that researchers and research facilities funded through the DHS [Center of Excellence] award meet the highest safety standards possible,” the UCCLS report states, “DHS requires every recipient to develop a Research Safety Plan” showing, among other things, that the researcher has identified the potential hazards in the research and will use accepted and appropriate protocols and practices; that the institution provides faculty oversight for students and education and training to develop a culture of safety; and that subject matter experts not involved in the research review protocols and practices. Likewise, in the wake of Sheri Sangji’s death, UC established UCCLS.
So clearly some organizations are learning. The new report also makes a fine addition to the already groaning shelf of excellent investigations into serious academic lab disasters. But how many more meticulous and well-meaning investigations must be done before the institutions that have not yet suffered get the message? It gets tiring reading so many strikingly similar studies. Here’s hoping—but, sadly, not expecting—that this will be the last.