The first results of a high-profile effort to replicate influential papers in cancer biology are roiling the biomedical community. Of the five studies the project has tackled so far, some involving experimental treatments already in clinical trials, only two could be repeated; one could not, and technical problems stymied the remaining two replication efforts.
Some scientists say these early findings from the Reproducibility Project: Cancer Biology, which appear tomorrow in eLife, bolster concerns that too many basic biomedical studies don’t hold up in other labs. “The composite picture is, there is a reproducibility problem,” says epidemiologist John Ioannidis of Stanford University in Palo Alto, California, an adviser to the project whose attention-getting analyses have argued that biomedical research suffers from systemic flaws.
But others say the results simply show that good studies can be difficult to precisely reproduce, because biological systems are so variable. “People make these flippant comments that science is not reproducible. These first five papers show there are layers of complexity here that make it hard to say that,” says Charles Sawyers, an eLife editor and cancer biologist at Memorial Sloan Kettering Cancer Center in New York City.
The cancer biology project was inspired by reports from two companies that when they tried to follow up on dozens of papers pointing to potential new drugs, they could not replicate as many as 89% of the studies. But the firms, Bayer and Amgen, did not reveal the specific papers or many details of their attempts. So in 2013 the nonprofit Center for Open Science in Charlottesville, Virginia, which had led a replication project for psychology papers, teamed up with Science Exchange of Palo Alto, a service that matches scientists with contract labs that do experiments for hire. The partners won $2 million from a foundation for a large-scale cancer replication effort.
The collaborators identified 50 high-impact preclinical cancer papers published from 2010 to 2012, began writing protocols to replicate key experiments, and published the plans in eLife. The project soon fell behind schedule, however, because information and materials proved difficult to obtain. In the end, no more than 29 replications will be done.
Among the initial replications, Jay Bradner came up a winner. A contract lab was able to verify a study he co-led with Constantine Mitsiades while at Dana-Farber Cancer Institute, testing an inhibitor of a class of proteins called BET bromodomains. Bradner and Mitsiades had reported in Cell in 2011 that the inhibitor blocks the Myc cancer gene and thereby slows the growth of multiple myeloma tumors in mice. Bradner, now at Novartis Institutes for Biomedical Research in Cambridge, Massachusetts, applauds the overall reproducibility effort and says it is “reassuring” that his study passed muster, especially because the compound is already in clinical trials. He’s not surprised, he says, because it has been replicated by other labs. Another replication attempt supported a 2011 Science Translational Medicine report from Stanford computational biologist Atul Butte’s lab that the ulcer drug Tagamet slows the growth of lung tumors in mice.
Two other replications ran aground, yielding inconclusive results. One was based on a 2012 Nature report that mutations in a gene called PREX2 spur melanoma growth. The replicating lab obtained samples of the human skin cells used in the original study, which had been engineered to slowly form tumors, and implanted them in mice. In the replicators’ hands, however, cancer cells with and without the PREX2 mutations both formed tumors within a week—not weeks as originally reported—making it impossible to tell whether PREX2 made any difference. The original study’s lead investigators, Levi Garraway, who recently moved from Dana-Farber to Eli Lilly, and Lynda Chin of the University of Texas System in Austin, suggest that the genetics of the cultured cells had likely changed over time.
Another lab ran into a similar problem when it tried to replicate work by Stanford stem cell biologist Irving Weissman and his colleagues, who reported in 2012 in the Proceedings of the National Academy of Sciences that an antibody to a tumor cell surface receptor called CD47 can slow tumor growth in mice. In the replicating lab, however, tumors grew extremely slowly in both treated and control mice—and in a few cases spontaneously regressed. The replicators couldn’t tell whether the antibody, which is in clinical trials, had affected tumor growth. Weissman notes, however, that other labs have replicated his CD47 results.
An attempt to replicate a 2010 Science report that a peptide called iRGD can help chemotherapy drugs penetrate and shrink prostate tumors in mice was a clear-cut failure. The contract lab found no evidence that the peptide did anything in mice. Yet labs in Germany, China, and Japan have replicated the results, notes Erkki Ruoslahti of the Sanford-Burnham Prebys Medical Discovery Institute in San Diego, California, the lead investigator on the original paper. “The literature is fairly overwhelming that it works if you do it right,” says Ruoslahti, who has filed for patents on iRGD. Albrecht Piiper of the University of Frankfurt in Germany, for example, says the iRGD results were “well reproducible” in his lab.
The commercial iRGD used by the replicating lab could be one explanation, Piiper and others say. Ruoslahti points out that the contract lab apparently did not do any assays to verify that the peptide had activity before testing its effect on tumors. Project Manager Tim Errington of the Center for Open Science responds that the team was forced to have a company make the peptide because Ruoslahti turned down their request for it. Ruoslahti says he doesn’t now recall whether that happened. He hopes to move iRGD into clinical trials but worries the failed replication will hinder efforts to find financing.
Some scientists are frustrated by the Reproducibility Project’s decision to stick to the rigid protocol registered with eLife for each replication, which left no room for trouble-shooting. The two inconclusive experiments both involved growing implanted tumor tissue in mice, which is challenging and takes practice, Weissman says. “Academic labs would have tried again, in a different way with a different cell dose, but [the replicators] were bound by the registered report,” says cancer stem cell biologist Sean Morrison of the University of Texas Southwestern Medical Center in Dallas, an eLife editor.
Science Exchange CEO Elizabeth Iorns, a cancer biologist herself, defends the project’s approach: “Following a process like this openly is quite rare. People really do want to know what happens when you try to do this.” Because of timing and budget issues—protocol reviewers’ requests added 25% more experiments, Iorns says—the project pared its list of replications. The first five cost $27,200 each on average, close to the original estimate of $26,000, but for others the cost will likely rise, Iorns says.
The National Institutes of Health (NIH) in Bethesda, Maryland, which helped fund the original studies, is glad they are coming under the microscope. NIH Principal Deputy Director Lawrence Tabak applauded the first replication results, saying they “highlight some of the incredible complexity that biology affords us and are reminders of how important it is to use rigorous methods that are completely transparent.”
*Correction, 18 January, 3:30 p.m.: A previous version of this story incorrectly identified Lynda Chin’s affiliation as MD Anderson Cancer Center. She is now based in Austin with the University of Texas System.