Scientists, universities, funding agencies, and journals alike should be doing much more to ensure the reproducibility of scientific research, according to a new report released Monday by the Royal Netherlands Academy of Arts and Sciences (KNAW).
The report adds to a growing number of voices calling for fundamental changes in the way science is conducted and published. It comes in the wake of recent failures to replicate published scientific work, a problem widely known as the “reproducibility crisis.” A panel at the U.S. National Academy of Sciences is also studying reproducibility and replication, and the British Psychological Society is holding an event on the topic later this month.
The KNAW panel, chaired by Johan Mackenbach, a public health researcher at Erasmus Medical Center in Rotterdam, the Netherlands, makes several recommendations both to improve the rigor of original scientific papers and to support scientists who conduct replications of previous research. Institutions should put greater emphasis on training in research design and statistical analysis, the report says, and teach scientists how to conduct replication studies. Journals should require authors to register reports in advance, so that the study protocol and analysis plan are locked in place before data collection even begins, and scientists should be encouraged to store methods and data in repositories to help other groups reproduce experiments.
The report also highlights the need to change the incentive structure in publishing and funding. Journals and grant agencies should take into account the rigor of a study, and not only reward innovative methods or novel findings. (In a newspaper interview yesterday, Mackenbach said that between 5% and 10% of research funding should eventually be spent on replication studies.) Journals should also devote more space to publishing both replication studies and null results, the group says.
“It’s a nicely balanced report that highlights the challenges for science in general,” says Daniël Lakens, an experimental psychologist at Eindhoven University of Technology in the Netherlands who was consulted by the panel. “It’s good that it acknowledges that this is an issue that we should think about.”
Although Monday’s report—and the discussion in the wider scientific community—concentrates on reproducibility in the life sciences, medicine, and psychology, the committee recommends that all scientific disciplines assess the extent to which results are reproducible. “When we look at the existing analyses of what causes these reproducibility problems, it’s quite clear that the same causes must occur elsewhere,” Mackenbach says.
After a suggestion from Lakens, the Dutch Organization for Scientific Research launched a €3 million fund last year specifically for conducting replication studies; the first nine funded research projects were announced last July. That fund, together with the new report, means the Netherlands is “pretty ahead of things,” Lakens says. But reproducibility is a global issue, he emphasizes: “This doesn’t stop at the border. Other countries can easily take over these conclusions because they apply just as much. And I hope they will.”