
Rotterdam Marketing Psychologist Resigns After University Investigates His Data

Clever statistical sleuthing by an anonymous fraud hunter in the United States appears to have led to the downfall of a marketing researcher at Erasmus University Rotterdam in the Netherlands. Today, the university announced in a statement (Dutch) that Belgian-born social psychologist Dirk Smeesters, who specialized in consumer behavior, resigned effective 21 June after an investigative panel found problems in his studies and concluded it had "no confidence in [their] scientific integrity." The university has also asked for the retraction of two of Smeesters's papers, one published in the Journal of Personality and Social Psychology in January and the other in the Journal of Experimental Social Psychology last year.

Smeesters, a professor at the Rotterdam School of Management, could not be reached for comment today, but the university panel's report (Dutch), provided by a university spokesperson, says he admitted to "massaging" the data in some papers to "strengthen" their outcomes, while defending the practice as common in his field. The case seems certain to further undermine confidence in social psychology, a field already struggling to show that its findings are reproducible, and it comes while the Dutch academic world is still recovering from the affair involving social psychologist Diederik Stapel, who, according to investigative panels, made up data for dozens of papers. Smeesters worked for several years at Tilburg University, Stapel's academic home, but the two did not collaborate, and the cases appear to be unrelated. There are several parallels, however.

Like Stapel, Smeesters led a number of high-profile studies, which, as Smeesters noted on his home page, were covered by many international news outlets. Some of his catchy research topics were whether models with a girl-next-door look might be more effective than supermodels such as Kate Moss, the effects of messiness (also a topic Stapel explored), and whether death-related media stories might make consumers prefer domestic brands. Like Stapel, Smeesters often collected and analyzed his data alone, even when collaborating with other researchers who helped design the studies, according to the university panel's report.

The case came to light after a whistleblower analyzed one of Smeesters's published papers and found that the data were "too good to be true," according to the panel. The whistleblower contacted Smeesters directly last year, the report says; Smeesters sent him a data file, which did not convince his accuser. On 30 November 2011, Smeesters himself asked for an appointment with a special university counselor to whom staff and students can report suspicions of misconduct, the report says. It doesn't say what Smeesters hoped to achieve, but the appointment, initially set for 7 February, was later canceled and replaced by an interview with an investigative commission.

In the panel's report sent to ScienceInsider, the whistleblower's name is redacted, as are most details about his method and the names of Smeesters's collaborators and others who were involved. (Even the panel members' names are blacked out, but a university spokesperson says that was a mistake.) The whistleblower, a U.S. scientist, used a new and unpublished statistical method to search for suspicious patterns in the data, the spokesperson says, and agreed to share its details provided that the method and his identity remain under wraps. "If he wants to publish his findings in a journal, the results shouldn't be out on the street in Rotterdam," the spokesperson says.

The investigating panel asked two statistical experts to analyze the method; after they deemed it "valid," the panel took a close look at the papers co-authored by Smeesters—including those still under review—for which he had control over the data. The statistical method could be applied to a total of 22 experiments; of those, three were problematic. The three were described in the two papers now up for retraction and in a third paper that had been submitted but not yet published, the spokesperson says.
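The whistleblower's actual method is unpublished and its details remain redacted, so nothing below describes it. As a purely illustrative sketch of what "too good to be true" can mean statistically, one generic check compares the spread of a study's condition means against the spread expected from sampling error alone: means that cluster far more tightly than chance allows are suspicious. The function name and the example numbers here are hypothetical.

```python
import statistics

def excess_similarity(groups):
    """Ratio of the observed spread of group means to the spread
    expected from within-group sampling error alone. Values far
    below 1 suggest the means are "too similar" given the noise
    inside each group. Illustrative only -- this is NOT the
    unpublished method used in the Smeesters case.
    """
    means = [statistics.mean(g) for g in groups]
    n = min(len(g) for g in groups)
    # Pooled within-group variance across conditions
    pooled_var = statistics.mean(statistics.variance(g) for g in groups)
    # Expected standard deviation of a single group mean
    expected_se = (pooled_var / n) ** 0.5
    # Actual standard deviation of the observed group means
    observed_sd = statistics.pstdev(means)
    return observed_sd / expected_se

# Hypothetical conditions whose means are identical despite noisy data:
suspicious = [[4, 6, 5, 5], [5, 5, 4, 6], [6, 4, 5, 5]]  # every mean is 5.0
print(excess_similarity(suspicious))  # -> 0.0 (no spread at all)

# Hypothetical conditions with ordinary variation between their means:
ordinary = [[3, 5, 4, 6], [6, 8, 7, 5], [2, 4, 3, 5]]
print(excess_similarity(ordinary))   # well above zero
```

In repeated honest experiments this ratio hovers around 1; a run of studies in which it sits near 0 is the kind of pattern such screening tools are designed to flag.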

The panel doesn't comment on the veracity of the remaining papers. Smeesters gave the group a series of data files, but because of time constraints, the committee examined only those pertaining to the two papers already published. In those files, the panel "discovered patterns that ranged from remarkable to extremely unlikely."

Smeesters admitted to employing the so-called "blue-dot technique," in which subjects who have apparently not read the study instructions carefully are identified and excluded from the analysis when doing so bolsters the outcome. According to the report, Smeesters said this type of massaging was nothing out of the ordinary. He "repeatedly indicates that the culture in his field and his department is such that he does not feel personally responsible, and is convinced that in the area of marketing and (to a lesser extent) social psychology, many consciously leave out data to reach significance without saying so."

But the university panel goes on to say that it cannot determine whether the numbers Smeesters says he massaged ever existed at all. He could not supply raw data for the three problematic experiments; they had been stored on a computer at his home that crashed in September 2011, and his brother-in-law had assured him its data were irretrievable. The "paper-and-pencil data" had also been lost when Smeesters moved his office at the school. The panel says it cannot establish that Smeesters committed fraud, but it holds him responsible for the loss of the raw data and for their massaging.

One of the two papers that the university says will be retracted was written with Jia Liu of the University of Groningen in the Netherlands; the other with Camille Johnson of San Jose State University in California and Christian Wheeler of Stanford University. None of these researchers responded to e-mails and voice messages left by ScienceInsider today. The Erasmus University Rotterdam statement said there is "no reason whatsoever to question the co-authors' good faith." The investigative report notes that Smeesters would usually find collaborators by approaching them at meetings, "during which Smeesters indicated he had access to an excellent lab with a subject pool, allowing him to take care of data collection easily."

Smeesters's Ph.D. students never had any doubts about his integrity, according to the commission; the allegations "came as a complete surprise to them."

*This item was updated on 29 June. A previous version of this story erroneously said Smeesters claimed to have lost the data when he moved house, rather than when he moved his office at the school.