New rules will require researchers to consider whether their results could be misused.

Florence Ivy/Flickr

U.S. asks universities to flag risky pathogen experiments

Academic scientists with federal funding who work with any of 15 dangerous microbes or toxins will soon have to flag specific studies that could potentially be used to cause harm and work with their institutions to reduce risks, according to new U.S. government rules released today.

The long-awaited final rule is similar to a February 2013 draft and is “about what we expected,” says Carrie Wolinetz, a deputy director of federal relations at the Association of American Universities (AAU) in Washington, D.C., which represents more than 60 major research universities. Those schools see the rules as duplicating other federal security and safety requirements, Wolinetz says, but will adjust to them.

But some observers have concerns, including that the rules do not apply to other risky biological agents. In a conference call with reporters today, a White House official said the government is open to a “broader discussion” about whether the list of 15 regulated agents should be expanded.

The rules are the latest in a flurry of regulations that grew out of the 2001 anthrax attacks and govern experiments that could potentially be used as bioterrorism weapons. Experts identified seven types of experiments that represent so-called dual use research of concern (DURC)—such as making an agent more transmissible or resistant to drugs. In March 2012, federal agencies announced that they would give special scrutiny to such DURC experiments with 15 dangerous agents or toxins; the 15 agents are part of a broader federal list of regulated “select agents” that pose particular risks to public health.

In February 2013, the government released a draft of a follow-on regulation that would require scientists and universities also to screen studies for DURC. And in the wake of controversial experiments that involved making H5N1 bird flu more transmissible among mammals, the government added separate regulations specifically for H5N1 and studies with another bird flu virus, H7N9.

Today’s institutional regulations are “an additional and important component” of an overall framework, said Andrew Hebbeler, assistant director for biological and chemical threats in the White House Office of Science and Technology Policy (OSTP). The rules, which cover institutions receiving federal funding for life sciences research, add a “grassroots approach,” he said. They will require that scientists working with any of the 15 agents who think their work may fall under the DURC definition notify a special review committee within their institution. If this committee agrees the research is DURC, it must notify the funding agency and develop a risk mitigation plan. Institutions can lose federal funding if they do not comply.

The policy includes tight deadlines—30 days for the institutional committee to notify the funding agency that it has identified a DURC experiment, and 90 days to submit the mitigation plan. In response to some of the 38 comments it received on the draft policy, OSTP made some revisions to the process, Hebbeler said. For example, institutions do not have to review proposals that haven’t yet gone through peer review; the institution only needs to show the funding agency that it has a DURC review process in place.

The new rules will take effect a year from now; institutions will need to submit progress reports for ongoing studies. Federal reviews since 2012 have found that of perhaps a “couple hundred” studies involving the 15 agents, “only a handful” met the DURC definition, said Amy Patterson, the National Institutes of Health (NIH) associate director for biosecurity and biosafety policy, on the press call. Scientists and institutions will need help figuring out whether research meets the DURC definition, she added: “This is clearly a learning curve.” NIH is releasing a guidebook and preparing other “tools,” such as case studies and workshops, she said.

AAU’s Wolinetz said institutions have been readying for the final policy, either by adding new duties to existing biosafety review committees or setting up separate DURC review panels. Her group’s concern is that the policy overlaps with existing regulations, including voluminous safety and security rules for working with select agents. The new rule “is unnecessary in a lot of ways,” she says.

Others see the new rules as inadequate. Molecular biologist Richard Ebright of Rutgers University, New Brunswick, a persistent critic of U.S. biodefense research, says the rules don’t require “a bona fide risk-benefit analysis,” which would require weighing risks and benefits, not just identifying them. He’s also concerned because the rules do not cover institutions that don’t receive federal funding for life sciences (although they are covered by select agent rules). And the list of 15 agents does not include some problematic pathogens, such as the viruses that cause SARS and MERS. Those are “glaring omissions,” Ebright says. Hebbeler said OSTP hopes for “an active dialogue with the community” about whether the list should be expanded.

The new rules are also unlikely to assuage concerns about so-called gain-of-function studies that make flu strains such as H5N1 more transmissible or lethal. Concerns have escalated in recent months following new publications and several accidents in federal biocontainment labs. Hebbeler said “we are actively discussing” those controversial studies and that the risks and benefits will be on the agenda at a 22 October meeting of the National Science Advisory Board for Biosecurity, which has not met in nearly 2 years. He declined to comment on whether the government will order a pause in such studies.
