The U.S. research community needs to do a better job of both investigating misconduct allegations and promoting ethical conduct—or the government might act unilaterally in ways that scientists won’t like.
That’s the implicit message sent by a new report out today from the National Academies of Sciences, Engineering, and Medicine entitled Fostering Integrity in Research. The report’s key recommendation is that universities and scientific societies create, operate, and fund a new, independent, nongovernmental Research Integrity Advisory Board (RIAB). The board would serve as a clearinghouse to raise awareness of the issues, as an honest broker to mediate disagreements, and as a beacon to help institutions that lack the knowledge or resources to root out bad behavior and foster good behavior.
Other entities are already doing these things, but none has research integrity as its sole focus, nor does any cover so much territory. Federal funding agencies investigate and punish miscreants who misuse taxpayer dollars, universities train scientists as part of their mission to advance knowledge, and scientific societies and journals have adopted ethical standards for their authors and members. After reviewing that landscape, the committee concluded that all of those organizations need to step up their game.
“We don’t think the system is broken, but we think there is a lot more we as a community can do,” says Robert Nerem, professor emeritus of bioengineering at Georgia Institute of Technology in Atlanta and chair of the committee that wrote the report.
And Nerem doesn’t think there’s any time to waste. “If we don’t do better, Congress could step in” and act unilaterally, he warns. “And I don’t think Congress [would have] the best interests of the research enterprise at heart in moving forward. Some legislators are champions of research, but I would be leery of what would come out of Congress.”
A long, hard slog
The committee began its work in 2012 with the goal of updating a 1992 National Academies report, Responsible Science, which made a similar pitch for an advisory body. (It went unheeded.) The 1992 report was prompted by a wave of research misconduct cases that soiled science’s reputation and led to the creation of the Office of Research Integrity (ORI) to investigate misconduct involving federally funded biomedical research.
But what began as a 2-year project wound up taking quite a bit longer. “The research enterprise has grown a lot in the past 25 years,” Nerem says, adding that the rise in interdisciplinary research, new technologies, concerns about reproducibility, and increased international collaboration have made research integrity a much more complex issue. The committee almost ran out of money at one point, and the exhaustive internal vetting process given every Academies report took even longer than usual.
Despite the widespread adoption of ethical standards by the scientific community, research misconduct continues to be a thorn in its side. ORI and its counterpart at the National Science Foundation (NSF) receive hundreds of allegations every year, which result in dozens of findings of misconduct.
Rather than burying its head in the sand, the committee decided that self-examination might help the community do its job better. An appendix to its report includes five notorious cases, including one several years ago at Duke University in Durham, North Carolina, involving fabrication by cancer researcher Anil Potti, in which Nerem says the school’s response to allegations of misconduct was as flawed as the behavior itself. “As good a research institution as Duke is, clearly, they fell down on the job,” Nerem says.
Although ORI concluded the Potti investigation in 2015, Nerem says he’s not convinced the institution has learned its lesson. “I don’t know if Duke has changed its practices, but it wouldn’t surprise me if it happened again,” he says. “After all, people are only human.”
The best way to reduce the number of such ugly incidents, Nerem believes, is for top university officials to make clear that training in the responsible conduct of research (RCR) is a priority at their institution. And that’s where the proposed advisory board could help.
A creation of the community, the RIAB would “have no direct role in investigations, regulation, or accreditation,” the report says. Nor would it set enforceable standards for the profession. But Nerem and others believe that the board’s focus on fostering ethical behavior could encourage institutions to pay more attention to the issue.
Marcia McNutt, president of the National Academy of Sciences in Washington, D.C. (and former Science editor-in-chief), thinks the board could be especially helpful to small institutions. “I see it as the great equalizer,” she says. “Big research organizations are well-oiled machines, but a smaller organization may not know how to proceed with an investigation or create an RCR training program.”
Another way to improve the quality of investigations would be to subject them to external peer review. “We do it for journal articles, and for grant proposals, so why not use outside experts on investigations as well?” Nerem asks.
The committee looked at various mechanisms for establishing the board and decided that it should sit outside the federal umbrella. “Most universities feel they are already being micromanaged by the government, and having this board be part of [the federal bureaucracy] might reinforce that,” he says.
McNutt sees another reason for its independence. “The federal government has been slow to adopt best practices on research integrity,” she says. “So I’m not surprised that the committee feels that it would be better for the board to be independent.”
The makeup and location of the board would be up to its founding members, who would also finance its operations. The panel proposed that the board have an annual budget of $3 million, with the money coming from federal agencies, foundations, and universities. Academic institutions would pay dues on a sliding scale based on how much research they perform.
McNutt says that the National Academies might consider a request to provide physical space for the board but that otherwise her organization would remain at arm’s length. That would also allow the Academies to carry out an independent evaluation of the board’s activities if it were asked to do so, she adds.
More than a checklist
Coincidentally, a new study in Science and Engineering Ethics highlights some gaps in training that the board might address. In 2007, Congress ordered NSF to require every grant application to include a plan for RCR training of all undergraduates, graduate students, and postdoctoral researchers who would be participating in the proposed research. But the study finds that most universities have adopted bare-bones programs that appear tailored to simply comply with the requirement rather than to truly educate the next generation of scientists by following best practices in the field. Most consist of online instruction rather than face-to-face interaction (see graph, above), deliver a single “dose” rather than ongoing ethics training, and take a one-size-fits-all approach that doesn’t differentiate by discipline or the student’s academic status.
“If you think of [RCR] as compliance, you’re missing the boat,” says medical ethicist Elizabeth Heitman of the University of Texas Southwestern Medical Center in Dallas, a co-author of the study. “The problem is that it’s become a checklist.”
The authors suggest that NSF also shoulders some responsibility for the current situation. NSF doesn’t require institutions to document that they are actually providing the training, giving them broad latitude to design their programs. Nor does NSF do any systematic monitoring of those efforts. In response, an agency spokesperson says that “NSF believes that … education in [RCR] is essential in the preparation of future scientists and engineers.”
Heitman believes the fundamental problem is that many universities view ethics training as an add-on rather than as part of their core mission. That leads to a cascade of poor practices, she says, including leaving the training to others, doing it on the cheap, and dropping it when a funding agency no longer requires it to win a grant. “We don’t outsource biostatistics or biochemistry education, but we think differently about ethics and research integrity,” she laments.
The impact of federal research policies on research integrity could be another ripe area for the new board to examine, notes the Academies report. Increasing competition for funding and the need for high-profile publications in order to win academic promotion can generate “detrimental research practices” that fall short of the federal definition of misconduct as fabrication, falsification, and plagiarism but nevertheless stray from acceptable norms.
“The ultimate goal of the board,” says McNutt, “would be to create a climate in which we never have to investigate research misconduct because it never occurs.”