Practical steps to make lab workers safer

Does this sound familiar?

Students and postdocs “do not feel empowered to address their concerns with others within the lab or with the faculty adviser. They also do not believe that they can move forward to effect positive safety changes without negative or punitive consequences … .”

How about this?

“Principal investigators operate autonomously, exercising significant authority over the research and the research personnel in their individual laboratories, and in some cases may regard good safety practices, such as inspections by outsiders or following established safety procedures, as a barrier to research progress and a violation of their academic freedom.”

Many—perhaps most—graduate students and postdocs can undoubtedly identify with these descriptions, which are taken from Safe Science: Promoting a Culture of Safety in Academic Chemical Research, an intelligent and informative report published 31 July by the National Academies' National Research Council (NRC).

The report calls itself a response to recent “serious and sometimes fatal” injuries to rank-and-file lab workers—not only students and postdocs but also technicians. These front-line lab workers face the greatest risk of injury from lax safety practices and know lab conditions best—yet they are often powerless to make their working lives safer.

As is usual for NRC reports, the committee that wrote this one is made up mostly of influential people—academic leaders, distinguished faculty members, and leading experts—so it’s not surprising that most of its conclusions and recommendations take a top-down, long-term view. Two of the recommendations, though, could be implemented fairly quickly and would empower those voiceless researchers who do the bulk of experimental work to do more to help ensure their own safety.

Learning from mistakes

The crux of the committee’s advice is that universities need to build robust and pervasive safety cultures. The concept of safety culture, the report explains, emerged as experts searched for the causes of the 1986 Chernobyl nuclear power plant disaster. Safety culture encompasses “the organizational context in which all actions pertinent to safety occur.” In the sort of “strong, positive” safety culture the committee believes universities should develop—and that already exists in major industrial research labs—people work safely “not because of a set of rules, but because of a commitment to safety throughout an organization” that integrates “safety as an essential element in the daily work of laboratory researchers.” Life in labs with a strong safety culture “supports the free exchange of safety information, emphasizes learning and improvement, and assigns greater importance to identifying and solving problems rather than placing blame.” In such labs, safety has “high importance … all the time, not just when it is convenient or does not threaten personal or institutional productivity goals.”

Such a vision is alien to many universities and would take a lot of time and effort to instill. But work could get underway promptly on a national system for confidential reporting of lab-safety incidents and near misses. The report lists several organizations it says “should work together to establish and maintain [such a] system, building on industry efforts, for centralizing the collection of information about and lessons learned from incidents and near misses in academic laboratories, and linking these data to the scientific literature” for use in safety research and training. “Department chairs and university leadership should incorporate the use of this system into their safety planning. Principal investigators should require their students to” use it.

The idea behind the proposed system comes from the Aviation Safety Reporting System (ASRS) of the Federal Aviation Administration, which pilots, air traffic controllers, mechanics, cabin crew, and others use to report—anonymously if they wish—near misses, close calls, and errors. More than a million such reports have arrived since the system’s inception in 1976; more than 70,000 reports were filed in 2012 alone. Experts analyze the reports and, when appropriate, issue alert messages to the industry. All the reports, with identifying information removed, are available online, where the “narratives provide an exceptionally rich source of information for policy development, human factors research, education, training, and more,” the ASRS website says. The authors of the NRC report hope to build a new academic lab-safety system that would work in much the same way, giving voice to the many lab workers who currently are powerless to improve safety and ensure that near misses become learning opportunities.

A second recommendation that could increase lab workers' ability to improve their own safety calls for "the researcher and principal investigator [to] incorporate hazard analysis into laboratory notebooks prior to experiments, integrate hazard analysis into the research process, and ensure that it is specific to the laboratory and research topic area." Making such discussions of risk and mitigation routine would increase the attention paid to these crucial issues in many labs.

The committee’s other recommendations generally require action by policymakers high up in each university’s power structure. They emphasize the need for strong and explicit commitment by top university leadership to safety culture as “a core value of the institution” and policies to back up and implement that commitment. The most interesting of these would change “criteria for promotion, tenure, and salary decisions for faculty” to include efforts to foster and maintain safety culture in the principal investigator’s (PI’s) lab and department. Suggestions (which, in standard NRC practice, are distinct from formal recommendations) include allocating departmental or university funds to finance safety equipment and hazard analysis not covered by grants; holding safety discussions in regular lab and departmental meetings; and allotting “start-up funds and other renovations … in part based on a department’s or unit’s safety practices.” In addition, “funding agencies may choose to include [steps to foster safety] in grant evaluations, for example, as part of the ‘broader impacts’ sections that are now being required in National Science Foundation grant proposals.”

Here’s an idea that isn’t included in the report: Safety-minded students and postdocs should read Safe Science and bring its informative analysis and useful suggestions to the attention of their lab chiefs, department chairs, deans, and other university officials.

Overcoming obstacles

As Safe Science notes, the "specialized and insular structure and hierarchical nature of academic research can pose challenges to the development" of safety culture. Among the "most recalcitrant" of these is "the attitude, unfortunately often reinforced by principal investigators, that safety practices are time-wasting inhibitions to research productivity." The report adds that "[e]fforts must be found to convince such people that working safely enhances, rather than inhibits, research productivity."

One method of persuasion that a number of well-regarded safety experts and reports consider necessary is missing from the recommendations: making a PI's safety record a criterion for funding. "When negligent or cavalier treatment of laboratory safety regulations jeopardizes everybody's ability to obtain funding, a powerful incentive is created to improve laboratory safety," states another NRC publication, the 2011 revision of the widely used Prudent Practices in the Laboratory. The Safe Science committee, however, considers the use of funding as a stick to enforce lab safety too controversial and prefers to drive change with carrots. Whether carrots alone can overcome the many formidable barriers to change in academic safety culture that the report amply enumerates is, according to some observers, equally open to debate.