When Philip Munday discussed his research on ocean acidification with more than 70 colleagues and students in a December 2020 Zoom meeting, he wasn’t just giving a confident overview of a decade’s worth of science. Munday, a marine ecologist at James Cook University (JCU), Townsville, was speaking to defend his scientific legacy.
Munday has co-authored more than 250 papers and drawn scores of aspiring scientists to Townsville, a mecca of marine biology on Australia’s northeastern coast. He is best known for pioneering work on the effects of the oceans’ changing chemistry on fish, part of it carried out with Danielle Dixson, a U.S. biologist who obtained her Ph.D. under Munday’s supervision in 2012 and has since become a successful lab head at the University of Delaware (UD), Lewes.
In 2009, Munday and Dixson began to publish evidence that ocean acidification—a knock-on effect of the rising carbon dioxide (CO2) level in Earth’s atmosphere—has a range of striking effects on fish behavior, such as making them bolder and steering them toward chemicals produced by their predators. As one journalist covering the research put it, “Ocean acidification can mess with a fish’s mind.” The findings, included in a 2014 report from the Intergovernmental Panel on Climate Change (IPCC), could ultimately have “profound consequences for marine diversity” and fisheries, Munday and Dixson warned.
But their work has come under attack. In January 2020, a group of seven young scientists, led by fish physiologist Timothy Clark of Deakin University in Geelong, Australia, published a Nature paper reporting that in a massive, 3-year study, they didn’t see these dramatic effects of acidification on fish behavior at all.
The paper has proved so polarizing in the field, “It’s like Republicans and Democrats,” says co-author Dominique Roche of Carleton University in Ottawa, Canada. Some scientists hailed it as a stellar example of research replication that cast doubt on extraordinary claims that should have received closer scrutiny from the start. “It is by far the best environmental science paper I have read for a long time,” declared ecotoxicologist John Sumpter of Brunel University London.
Others have criticized the paper as needlessly aggressive. Although Clark and his colleagues didn’t use science’s F-word, fabrication, they did say “methodological or analytical weaknesses” might have led to irreproducible results. And many in the research community knew the seven authors take a strong interest in sloppy science and fraud—they had blown the whistle on a 2016 Science paper by another former Ph.D. student of Munday’s that was subsequently deemed fraudulent and retracted—and felt the Nature paper hinted at malfeasance. The seven were an “odd little bro-pocket” whose “whole point is to harm other scientists,” marine ecologist John Bruno of the University of North Carolina, Chapel Hill—who hasn’t collaborated with Dixson and Munday—tweeted in October 2020. “The cruelty is the driving force of the work.”
What few researchers know is that in August 2020, Clark and three others in the group took another, far bigger step: They asked three funders that together spent millions on Dixson’s and Munday’s work—the Australian Research Council (ARC), the U.S. National Science Foundation (NSF), and the U.S. National Institutes of Health (NIH)—to investigate possible fraud in 22 papers.
The request, which they shared with a Science reporter, rests on what they say is evidence of manipulation in publicly available raw data files for two papers, one published in Science, the other in Nature Climate Change, combined with remarkably large and “statistically impossible” effects from CO2 reported in many of the other papers. They also provided testimony from former members of the Dixson and Munday labs, some of whom monitored Dixson’s activities and concluded she made up data.
ARC and NSF declined to discuss the case with Science, but said they generally refer such cases to the research institutions—in this case JCU; the Georgia Institute of Technology, where Dixson worked between 2011 and 2015; and UD. NIH said it refers cases to the U.S. Office of Research Integrity, which does not comment on cases.
Munday calls the allegations of fraud “abhorrent” and “slanderous,” and a JCU spokesperson says the university has dismissed the allegations after a preliminary investigation. (Munday retired from JCU in April and has moved to Tasmania, but emphasizes there is no connection between that timing and the allegations.) UD says it cannot comment on personnel matters; a Georgia Tech spokesperson declined to comment except to say the institute “takes all allegations of research misconduct seriously.” Dixson denies making up data as well. “I fully stand by all the data I’ve collected, I stand by the papers that we’ve published,” she told Science in a February interview. “The data was collected with integrity. I mean, I preach that to my students.”
But multiple scientists and data experts unconnected to the Clark group who reviewed the case at Science’s request flagged a host of problems in the two data sets, and one of them found what he says are serious irregularities in the data for additional papers co-authored by Munday.
The fight, between two groups united by their passion for fish, isn’t just about data and the future of the oceans. It highlights issues in the sociology, psychology, and politics of science, including pressure on researchers to publish in top-tier journals, the journals’ thirst for eye-catching and alarming findings, and the risks involved in whistleblowing.
Members of the Clark group say they will soon publicize the alleged data problems on PubPeer, a website for discussion of published work. And they say they thought long and hard about whether to discuss their concerns with a reporter while investigations may be ongoing. “In my experience, whistleblowers, myself as well as others, are shamed for talking to the media before an investigation has concluded misconduct,” says Josefin Sundin of the Swedish University of Agricultural Sciences, the last author on the Nature replication paper. “But why is that? If an investigation even takes place, it can drag on for a very long time. If you know that data have been fabricated, why is it considered the right thing to do to stay silent about it for months and even years?”
Townsville may be one of the world’s best places to go if you want to become a marine biologist. JCU’s website boasts “a unique tropical learning environment with research stations, state-of-the-art laboratories and the Great Barrier Reef right on our doorstep.” The university is home to the ARC Centre of Excellence for Coral Reef Studies, where Munday had his lab. For field studies, scientists fly to Lizard Island, a granite rock on the reef that’s legendary among marine biologists, thanks to the Australian Museum’s well-run research station, otherworldly diving, and beach barbecues. (“Man, I miss that island,” Dixson tweeted last year.)
Munday’s lab has been one of the engines of JCU’s success. Originally focused on competition in reef fish and their ability to switch sex, Munday shifted his attention to ocean acidification in the late 2000s. Dixson, who arrived at JCU in 2007, embraced the topic.
Acidification, which results as rising CO2 levels cause more of the gas to dissolve in the ocean, poses serious threats to ocean life, weakening corals and other organisms with carbonate shells or skeletons. But a 2009 paper in the Proceedings of the National Academy of Sciences (PNAS) on which Munday and Dixson were the first and second author, respectively, reported another troubling effect. When the pH of the water in fish tanks was lowered from 8.15, the current level in ocean water, to 7.8, the level expected by the end of the 21st century, larvae of the orange clownfish were less attracted by the chemical cues from a healthy reef—but more attracted to cues from grass and a pungent swamp tree whose smell normally repels them. That could cause clownfish to lose their ability to find suitable homes on the reef, the authors concluded.
Based on more lab experiments, Munday, Dixson, and other researchers later reported that high CO2 levels mess with fish minds in other ways as well: They become disoriented, hyperactive, and venture farther from shelter, for instance, while their vision and hearing deteriorate. For many of those studies, the scientists measured fish’s preferences by placing them in a flume, an apparatus that forces them to make a choice (see graphic, below). Water from two different sources flows into the flume, side by side, and researchers measure how much time the fish spend in water from either source.
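The flume measurement described above boils down to scoring how much of a trial a fish spends on the side carrying a given chemical cue. A minimal sketch of that calculation, with entirely hypothetical tracking data (the real studies used timed observations or automated video tracking, not these names or numbers):

```python
def side_preference(x_positions, midline=0.0):
    """Fraction of tracked frames a fish spends on the cue side of a
    two-channel choice flume (illustrative sketch only)."""
    cue_side = sum(1 for x in x_positions if x > midline)
    return cue_side / len(x_positions)

# Hypothetical tracked x-coordinates: positive values = cue-water side.
track = [0.8, 0.5, -0.2, 0.3, 0.9, -0.1, 0.4, 0.7]
print(side_preference(track))  # → 0.75
```

A preference near 0.5 indicates no attraction or avoidance; the disputed papers reported values near 0 or 1.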
Munday and Dixson often found unusually large effects from ocean acidification. In the PNAS paper, for example, the time orange clownfish spent on the foul-smelling side of the flume went from 0% to 80%. In a 2010 study in Ecology Letters, clownfish larvae reared in normal ocean water completely avoided chemical cues of two predator species, the small rockcod and the dottyback, but in more acidic water they spent 100% of their time around those predators’ scents—a “fatal attraction,” the authors said. A 2013 paper in Marine Biology reported that coral trout, an economically important species, became 90 times more active at a high CO2 level.
Dixson also used the flume for studies not related to ocean acidification. The Science paper the whistleblowers have challenged, published in 2014 while she was at the lab of Georgia Tech marine ecologist Mark Hay, showed that fish and coral larvae collected on Fiji’s coast are attracted by the chemical cues from healthy, protected reefs, but repelled by water from overfished reefs dominated by seaweeds. (It was, again, bad news, suggesting degraded reefs will have trouble recovering on their own because they’re unable to entice coral and fish larvae.)
Not long after that paper was published, Science received a “technical comment” from JCU reef ecologist Andrew Baird, who noted several problems, including the fact that the water flow quoted for Dixson’s flume was much faster than any coral larvae have been reported to swim, meaning larvae would be washed out of the back of the flume. Science’s review process deemed the comment “as low priority for publication,” says Deputy Editor for Research Sacha Vignieri. (Science’s news and editorial departments operate independently of each other.) Baird published it as a preprint instead, but it drew little attention.
The Fiji paper was Dixson’s second in Science—in 2012 she and Hay had reported some corals secrete chemical signals that “recruit” fish to trim toxic seaweeds—and it cemented her status as a rising star. In 2015, she left Georgia Tech to start her own lab in Delaware.
Clark, a fish physiologist, came to Townsville in 2011 to take a research job at the Australian Institute of Marine Science, a government laboratory on a cape 50 kilometers east of the city. Looking for new things to study, he says he started to read Dixson’s and Munday’s ocean acidification papers—and was struck by the large effect sizes. “I thought they were some of the most phenomenal findings in the whole discipline of biology,” he says. He set out to Lizard Island to repeat the work with predator cues, thinking he could unravel the physiology behind the phenomenon.
But he didn’t get the same results at all. Placed in the flume, fish would start to explore their surroundings, but they rarely had the strong preference for one side or the other that Dixson and Munday reported, and amping up the CO2 did not make a difference. Some fish were “terrified,” and didn’t move at all, says Fredrik Jutfelt of the Norwegian University of Science and Technology, who joined Clark for a season on Lizard Island in 2014, along with Sundin and several other scientists. “They’re taken out of their environment and placed in a highly unnatural situation,” Jutfelt says.
Clark was eager to publish the findings, but says, “We could think of quite a few methodological reasons that could be used to downplay our findings.” He and Sundin organized a second season on Lizard Island to re-create other studies. The researchers videotaped every experiment and used automated tracking software to monitor fish behavior as a way to rule out bias, which Munday and Dixson had not done.
The work was still in progress when, in June 2016, Science published a paper by marine ecologist Oona Lönnstedt, another successful JCU alum. After obtaining her Ph.D. in Townsville—where Munday was one of her three supervisors—Lönnstedt had returned to her native Sweden in 2014 and become a postdoc at Uppsala University (UU) to explore a new environmental threat: the tiny fragments of plastic waste known as microplastics that pollute many aquatic environments. In the 2016 Science paper, she and UU limnologist Peter Eklöv reported that perch larvae from the Baltic Sea had a penchant for swallowing microplastics instead of real food, which reduced their growth and—just like elevated CO2—disturbed their response to chemical cues, making them more likely to end up as fish food themselves.
Sundin, also at UU at the time, says she immediately thought the paper was “a total fantasy.” She and Jutfelt had spent time with Lönnstedt at a Baltic island research station, but had never seen a study on the scale described in the paper; many other details didn’t add up to them either. They decided to report their suspicions, with support from Clark, Roche, and other researchers they knew from Lizard Island.
The group learned some painful lessons. After a preliminary inquiry, a UU panel dismissed the request for an investigation in a terse report and berated the team for failing to discuss its concerns with Lönnstedt and Eklöv in a “normal scholarly discussion.” Lönnstedt said the group was simply jealous. The accusers spent many months gathering additional documentation, at the expense of their own research. In April 2017, Sweden’s Central Ethical Review Board concluded there had indeed been “scientific dishonesty” in the research, and Science retracted the paper; 8 months later, a full UU investigation concluded the data had been fabricated. (Eklöv blamed Lönnstedt; Lönnstedt maintained her innocence.)
The brazenness of the apparent deception shocked Jutfelt. “It really triggered my skepticism about science massively,” he says. “Before that paper, I could not understand how anyone could fabricate data. It was inconceivable to me.” Now, he began to wonder how many other papers might be a total fantasy. The experience also taught the group that, if they were ever to blow the whistle again, they would have to bring a stronger case right from the start, Clark says.
On the other side of the globe, the group’s accusations had Munday’s attention. “It seems that Clark and Jutfelt are trying to make a career out of criticizing other people’s work. I can only assume they don’t have enough good ideas of their own to fill in their time,” he wrote to Lönnstedt in a June 2016 email that she used in her defense to the ethics board. “Recently, I found out they have been ‘secretly’ doing work on the behavioural effects of high CO2 on coral reef fishes, presumably because they want to be critical of some aspects of our work.”
In January 2020, Nature published the Clark team’s findings: Elevated CO2 levels in water had a “negligible” effect on fish’s attraction to chemical cues from predators, their activity levels, and “lateralization”—their tendency to favor their left or right side in some behaviors. Based on a statistical procedure called a bootstrapping simulation, the team reported that Munday’s and Dixson’s data on chemical signal preference had a “0 out of 10,000” chance of being real. They left it to the reader to decide what to think about this.
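The bootstrapping logic can be sketched in a few lines: resample the replication data many times and count how often a resampled mean reaches the originally reported effect. The function name and data below are hypothetical illustrations, not the Clark team's actual code or numbers; their published analysis was more elaborate.

```python
import random

def bootstrap_extreme_prob(replication_scores, reported_effect,
                           n_boot=10_000, seed=1):
    """Estimate how often resampled replication data produce a mean
    at least as large as the originally reported effect (sketch only)."""
    random.seed(seed)
    n = len(replication_scores)
    hits = 0
    for _ in range(n_boot):
        sample = [random.choice(replication_scores) for _ in range(n)]
        if sum(sample) / n >= reported_effect:
            hits += 1
    return hits / n_boot

# Hypothetical data: fraction of time spent near a predator cue under
# high CO2. Replication fish hover near 50% (no preference); the
# original papers reported values near 90-100%.
scores = [0.45, 0.52, 0.48, 0.55, 0.50, 0.47, 0.53, 0.49, 0.51, 0.46]
print(bootstrap_extreme_prob(scores, reported_effect=0.9))  # → 0.0
```

Because no resampled mean can exceed the largest observed value (0.55 here), not one of the 10,000 simulations reaches the reported effect — the "0 out of 10,000" result.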
As Clark had predicted, the ensuing debate focused on differences in methods. (It often does when replications fail, says psychologist Brian Nosek of the University of Virginia, a pioneer of the replication movement.) In a rebuttal published in Nature in October 2020, Munday, Dixson, and 11 other coauthors argued that the replication effort differed in at least 16 “crucial” points from the original studies, including the species used, the CO2 measurement methods, and the fact that some of the work was done during a 2016 heat wave on Lizard Island, which could dampen CO2’s effects. “They altered methods in critical ways that reduce the likelihood of detecting effects,” Munday said in his December 2020 Zoom seminar.
Clark acknowledges real differences in his team’s approach, sometimes necessary because the original papers were vague or the methodology as described didn’t work, he says. Other points were easy to counter, he says. The team did use several of the same species as Munday and Dixson, for example, and the heat wave did not affect the temperature in their fish tanks. And, he notes, “Massive effects in many species should not suddenly disappear because of small changes or improvements in the methodology.”
Sumpter agrees. The Munday and Dixson defense strategy was “I’ll overwhelm them with Christ-knows-how-many different reasons. That doesn’t seem to me the way to go about this,” he says. But the rebuttal convinced Bruno of the “bro-pocket” tweet. “I am shocked that Nature published a putative ‘repeatability’ study that didn’t even come close to mimicking the original science. Lame,” he tweeted. “I’d be amazed if Clark et al wasn’t retracted.”
Others defended Munday and Dixson more diplomatically. Four “grandfathers in the field,” as Bruno calls them, criticized the replication in a paper in Biogeosciences. One author, Hans-Otto Pörtner of the Alfred Wegener Institute in Bremerhaven, Germany, says his own work had been on the receiving end of criticism from the “youngish group” in the past. “Building a career on judging what other people did is not right,” says Pörtner, who co-chairs one of IPCC’s three working groups. “If such a controversy gets outside of the community, it’s harmful because the whole community loses credibility.”
Jutfelt says the team soon had reason to go even further. After the publication in Nature, they were approached by several former members of the Munday and Dixson labs eager to discuss their experiences. He and Clark also contacted former students themselves and collected written statements, eight of which are included in the request for the investigation they sent to the three funding agencies in August 2020. (All but one of the former students and colleagues were willing to have their identities revealed to the funders, although several told Science they worried about retaliation.)
The statements about Munday’s lab don’t contain concrete fraud accusations, but those about Dixson do. One former colleague became “slowly more suspicious,” especially about the fluming studies, and began to monitor them covertly. The testimony contains text messages and photos of lab notebooks that appear to show Dixson did not spend enough time in the lab for one particular study and could not possibly have produced the data she reported.
Speaking to Science, this researcher said they left UD after asking officials there to investigate Dixson’s work. (A UD spokesperson says the university does not comment on personnel matters.) Two other former lab members say they, too, suspected Dixson of making up data.
One of the former lab members also started to examine raw data files for papers authored by Munday and Dixson, looking for irregularities. For most papers, the files were not available—many journals ask scientists to post their raw data, but don’t enforce that policy—but in two available files, they found several problems. Dixson’s 2014 Science paper—on which Munday was not an author—appears to have dozens of duplicated sets of numbers. For example, one column of 20 numbers appears at least six times in the Excel file—suggesting that the same experiment in 20 individuals in each of six species produced exactly the same results every time. Scientists committing fraud have been known to duplicate data as an easy way to make them up.
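The check the former lab member describes amounts to looking for columns of measurements that repeat verbatim across supposedly independent trials. A minimal sketch, using hypothetical numbers rather than the actual spreadsheet contents:

```python
from collections import Counter

def find_duplicate_columns(columns):
    """Flag identical columns of measurements. Repeated blocks of raw
    numbers across independent trials are a classic red flag, though
    they can also arise from copy-paste error."""
    counts = Counter(tuple(col) for col in columns)
    return {col: n for col, n in counts.items() if n > 1}

# Hypothetical raw data: three "independent" trials, two of which are
# identical -- the kind of pattern the whistleblowers describe.
trials = [
    [12.1, 30.5, 18.2, 25.0],
    [14.7, 22.3, 19.9, 27.4],
    [12.1, 30.5, 18.2, 25.0],
]
print(find_duplicate_columns(trials))  # → {(12.1, 30.5, 18.2, 25.0): 2}
```

With real measurements carrying several significant digits, an exact repeat of a whole column is vanishingly unlikely by chance, which is why duplications draw scrutiny even when fraud is not assumed.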
Chris Hartgerink, a Dutch statistician who has a Ph.D. on detecting fraud in data sets and now heads a company named Liberate Science, confirmed the duplications in a review done at Science’s request and says they “would warrant an expression of concern at the least.” He would not immediately label them a sign of fraud, however. “It seems like something went deeply wrong in this research project, and that can be due to malfeasance, systematic negligence, or a wide range of other distorting factors,” Hartgerink says. Another “data detective” consulted for this story, Nicholas Brown, says the file provides “essentially indisputable evidence” that the data were made up. (Brown, who obtained a Ph.D. from the University of Groningen in the Netherlands in 2019, is retired and lives in Spain.)
Timothy Clark and colleagues say duplications in raw data files—such as these in activity measurements for two fish species in a 2014 paper—suggest fraud. Philip Munday says the block of 10 duplicated numbers is the result of a human error that he will correct.
Dixson, in the February interview, said she did not know about the allegations. Although she denies making up data, “There hypothetically could be an error in there,” she said, perhaps because of mistakes in transcribing the data; “I don’t know. I’m human.” Hay, the paper’s last author, says he’ll be “able to discuss what I know and my impressions once investigations are finalized,” but did not specify who is investigating. Vignieri, the Science editor, says she had not heard about the problems, but that universities, not the journal, would normally do an investigation. She agrees the sample sizes and the reported effects were large, but says “the two often go together and neither can be taken as worrying indicators, by default.”
Clark and colleagues also found problems in the data for the 2014 paper in Nature Climate Change, which showed fish behavior is altered near natural CO2 seeps off the coast of Papua New Guinea. (Munday was the first of five authors on the study, Dixson the third.) That data set also contained several blocks of identical measurements, although far fewer than in the Science paper. Ecologist Nicholas DiRienzo of the University of Arizona, who was consulted for this story, confirmed the duplications—and found additional ones that he calls “another strong indicator of fabrication.”
Munday says Dixson has recently provided him with one original data sheet for the study, which shows she made a mistake transcribing the measurements into the Excel file, explaining the largest set of duplications. “This is a simple human error, not fraud,” he says. Many other data points are similar because the methodology could yield only a limited combination of numbers, he says. Munday says he has sent Nature Climate Change an author correction but says the mistake does not affect the paper’s conclusions.
Brown, who decided to delve more deeply into the case on his own, identified problems of a different nature in two more Munday papers that had not been flagged as suspicious by the Clark team and on which Dixson was not an author. At about 20 places in a very large data file for another 2014 paper in Nature Climate Change, the raw data do not add up to total scores that appear a few columns farther to the right. And in a 2016 paper in Conservation Physiology, fractions that together should add up to exactly one often do not; instead the sum varies from 0.15 to 1.8.
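Both anomalies Brown describes are simple internal-consistency failures: components that should sum to a stated total, or to exactly one, do not. A sketch of that check, with hypothetical rows standing in for the actual files:

```python
def check_sums(rows, expected_total=1.0, tol=1e-6):
    """Return indices of rows whose components do not add up to the
    expected total -- the internal-consistency check described above."""
    return [i for i, row in enumerate(rows)
            if abs(sum(row) - expected_total) > tol]

# Hypothetical time-budget fractions per fish; each row should sum to 1.
rows = [
    [0.25, 0.25, 0.50],   # consistent
    [0.10, 0.05, 0.00],   # sums to 0.15 -- anomalous
    [0.90, 0.60, 0.30],   # sums to 1.80 -- anomalous
]
print(check_sums(rows))  # → [1, 2]
```

The tolerance allows for floating-point rounding; sums of 0.15 or 1.8 are far outside any rounding explanation, which is why Brown reads them as signs the numbers were entered rather than computed.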
Munday concedes that both data sets have problems as well, which he says are due to their first authors hand copying data into the Excel files. He says the files will be corrected and both journals notified. But Brown says the anomalies strongly suggest fabrication. No sensible scientist would calculate results manually and then enter the raw data and the totals—thousands of numbers in one case—into a spreadsheet, he says.
To him, the problems identified in the data sets also cast suspicions on the “ludicrous effect sizes” in many of the 22 papers flagged by the whistleblowers. “Suppose you’re going to the house of somebody you think may have been handling stolen televisions, and you found 22 brand new televisions in his basement, and three had serial numbers that corresponded to ones that have been stolen from shops,” Brown says. “Are you going to say, ‘Yeah, we’ll assume you’ve got the purchase receipts for the other 19?’” Hartgerink is more cautious. “I did find patterns of recurring issues across the 22 papers, which may prove problematic and warrant further investigation,” he wrote in his report for Science, “but it is not possible to establish intent, motivation, or origin of these issues.”
JCU says it has dismissed the case brought by the Clark group following the advice of an external investigator. The problem in the data on fish living near the CO2 seeps constituted only a “minor breach” of JCU’s Research Code that Munday has offered an explanation for, a spokesperson says.
Clark and his colleagues say they are not surprised. Years ago, neither UU nor JCU responded adequately to the concerns about Lönnstedt’s work, they say. “Universities have a big conflict of interest,” Sundin says, echoing the experience of many other whistleblowers. “Thankfully, Sweden had a Central Ethical Review Board,” Clark says. “Australia has no such board.”
The group now worries that if Dixson is found to have fabricated data, Munday will escape responsibility despite being her supervisor during part of her work. That would be reminiscent of the microplastics case, in which Eklöv bore part of the responsibility, according to the second UU review panel, but was not found guilty of fabrication and kept his job while Lönnstedt lost hers. The outcome would be doubly sad because Lönnstedt and Dixson served as role models in a field where women are underrepresented, says Sandra Binning of the University of Montreal, a co-author of the Nature replication: “That’s not lost on me.”
Since Munday and Dixson began to publish on ocean acidification in 2009, their work has spawned a minifield that drew in dozens of other researchers. Many—even members of the Clark group, in earlier work—have reported seeing changes in how fish behave, although rarely as dramatic as those Munday and Dixson claimed. Researchers also tried to unravel the physiological and neural underpinnings of the behavioral changes. In the rebuttal in Nature, Munday noted that 85 papers, with more than 180 co-authors from more than 90 institutions, have by now reported effects from elevated CO2. As Dixson asked in a tweet criticizing the replication effort, “Can one study claim to overturn 11 years of research?”
Jutfelt says the reality is more complex. Of those 85 papers, 43 were co-authored by Munday, he notes. More important, unconscious and conscious bias may have played a role, he says. Many studies that did report an effect, including his own, weren’t blinded. Researchers knew which fish were exposed to high CO2 levels, and they knew what fish were expected to do under those circumstances. Some of his own past studies, including one that found strong effects of CO2 in sticklebacks in Sweden, were “poorly designed,” Jutfelt admits. “I’ve become much more aware of all of this. I think I have become a better scientist,” he says, pointing to the automated analyses in the replication effort.
Sumpter is also not surprised so many published studies went in the same direction. He says his own field, ecotoxicology, is rife with small studies with large effect sizes; they’re easy to publish. “But if you claim that chemical X doesn’t seem to do very much at relevant concentrations, it’s quite likely that a journal editor will just bat it right back to you, saying, without using quite these words: not really exciting.”
Still, the reported effects of CO2 on fish behavior and ecology have seemed to fade as the years passed. A recent meta-analysis of 95 papers by Jeff Clements of Fisheries and Oceans Canada, published as a preprint with Jutfelt, Sundin, and Clark, showed the field is experiencing a strong “decline effect,” a phenomenon where, after dramatic initial findings, reported effects become smaller and smaller.
Munday, in his December seminar, acknowledged that later studies had identified “mitigating factors” that dampened the alarming impacts seen in his lab’s early experiments. “The effects that we would predict now we have this additional knowledge from a decade of research,” he said, “would be much less than what we would have predicted when the very first studies were published.” Time may tell whether those effects exist in the first place.
This story was supported by the Science Fund for Investigative Reporting.