Panel Calls for Closer Oversight of Biomarker Tests

A widely anticipated review of flawed research at Duke University has found broad problems in using genetic signatures to guide cancer treatment. The Institute of Medicine (IOM) report says that so-called "omics" tests—diagnostic tools based on molecular patterns—are highly prone to errors; it recommends they be rigorously validated before being used in clinical trials. The report also calls on journals, funders, and institutions to take steps to avoid what it calls a "failure" of oversight that allowed problems to go unchecked at Duke.

The fallout from the Duke case includes 27 papers that Duke expects to be partially or completely retracted, three canceled clinical trials, and a lawsuit against Duke by patients in the trials. What happened reflects "a rush" to move genomics-based tests into the clinic and commercialize them, says committee chair Gilbert Omenn, a computational biologist at the University of Michigan, Ann Arbor. "There are a lot of lessons here that surely apply to other places," says Omenn.

The controversy began after Duke researchers led by Anil Potti and his mentor, cancer geneticist Joseph Nevins, reported that patterns of gene activity, or gene expression, in tumor cell lines could be used to predict how individual patients responded to various chemotherapy drugs. Their results first appeared in a 2006 Nature Medicine paper and later in other journals. The university launched three clinical trials in which breast and lung cancer patients were to receive drugs matched to their gene expression results.

Meanwhile, two biostatisticians at MD Anderson Cancer Center in Houston, Texas, Keith Baggerly and Kevin Coombes, found apparent errors in the Nature Medicine paper and related publications and published a critique in September 2009. The National Cancer Institute (NCI), which had questions about results for a Duke gene signature it wanted to use in a new trial, also began to investigate. Duke stopped the trials for a review but restarted them a few months later.

Then in July 2010, after The Cancer Letter, a Washington, D.C., newsletter, reported that Potti had falsely claimed to be a Rhodes Scholar, more than 30 statisticians and bioinformaticians wrote to NCI Director Harold Varmus asking NCI to investigate. Varmus requested the IOM study, and Duke launched a new investigation and halted the trials a second time.

The IOM found many potential problems inherent in tests based on omics, which it defines as research that looks for patterns in large sets of molecules such as proteins, DNA, RNA, or metabolites. Such tests offer great potential for guiding patient care, but because they can be difficult to reproduce, fewer than expected have reached the clinic, the report notes.

One major problem, the report says, is "overfitting": Because the studies often look for patterns in hundreds of biomolecules using a relatively small number of patient samples, it is easy to find correlations that do not reflect the biology of patients' disease. The report recommends a set of steps to validate the tests, such as repeating the test on blinded samples from a different institution. Journals and funders should also require that data and models from papers be made freely available so that other researchers can check the results.

Before evaluating a test in the clinic, researchers need to "lock down," or freeze, the computational model so that it cannot be changed, the report says. The researchers should also consult with the Food and Drug Administration (FDA) long before they use the test in a clinical trial—which Duke did not do. And FDA needs to clarify its requirements.

The IOM committee also found problems with oversight at Duke, where "multiple systems put in place ... to ensure the integrity and rigor of the scientific process failed." A 31-page appendix offers a blow-by-blow account of what happened at Duke. It notes that financial conflicts of interest (Duke investigators had patents on technology and ties to companies developing the tests) and deference to a senior professor may have influenced the university to dismiss concerns about the papers. Duke attributed its failure to "missed signals."

To avoid these problems, institutions need to strengthen their oversight of conflicts of interest and processes for responding to questions about published research.

MD Anderson's Baggerly says his initial read of the report is "quite positive." If the recommendations had all been in place, "I suspect many of the problems encountered might have been short-circuited," he says.

NCI's Lisa McShane, who herself spent months trying to validate the Duke results, says the IOM committee "did a really fine job" in laying out the issues. NCI now plans to require that its cooperative groups that want to use omics tests follow a checklist similar to the one in the IOM report. NCI has not yet decided whether it should add new requirements for omics tests to its peer-review process for investigator-initiated grants. But "our hope is that this report will heighten everyone's awareness," McShane says.

In a statement, Duke said it "greatly appreciate[s] the thoughtful and thorough work done by the IOM committee" and expects to incorporate the recommendations into "ongoing efforts to strengthen the rigor of our research enterprise. We have learned from our situation," the statement says.

Potti left Duke in 2010 and later joined a medical practice in South Carolina, but after 60 Minutes aired a segment about the controversy in February, he was let go. A scientific misconduct investigation at Duke is ongoing, according to the university.