Addressing Scientific Fraud

An interim report released in October 2011 by Tilburg University, Netherlands, concluded that one of its faculty members, social psychologist Diederik Stapel, fabricated data for numerous studies conducted over a period of 15 to 20 years. The good news, of course, is that the fraud was eventually uncovered. The bad news is that it went undetected for so long and involved so many scientific articles: over 100 publications are now under investigation. The costs of the fraud, for the careers of young scientists and others who worked with him, for science, and for public trust in science, are devastating.

by Jennifer Crocker and M. Lynne Cooper
Science

As the investigation unfolds, the moment is opportune to reflect on what can be done to protect science and the public from fraud in the future. Scientists generally trust that fabrication will be uncovered when other scientists cannot replicate (and therefore fail to validate) findings. In this case, however, replication as the first line of defense did not work. Why? Social psychologists, like other scientists, value novel contributions. Although reproducible results are what drive cumulative progress, studies that replicate (or fail to replicate) others’ findings are almost impossible to publish in top scientific journals. This disincentive allows fraud to go undetected, as it did with Stapel.

The peer-review system is another possible line of defense, but it is not designed to catch cheaters. The American Psychological Association (APA) began using an electronic manuscript tracking system in 2003. Since then, Stapel submitted 40 manuscripts to APA journals; 16 were rejected and 24 were accepted. That is a large enough body of work that one might expect irregularities to have been detected. However, those 40 manuscripts were handled by 25 different editors, and under such circumstances it would be almost impossible for any one of them to detect a pattern of data fabrication.

What did work in this case, and perhaps in most cases where fraud is detected, is that people close to the perpetrator developed suspicions and came forward. According to the university committee’s interim report, other researchers had raised questions several times, but their concerns were not followed up on. In the end, six junior researchers had the courage (and it does take courage) to gather the evidence and report it. It is risky for whistleblowers to come forward, and difficult for authorities to respond appropriately, because students, colleagues, and universities have so much to lose when fraud is alleged.

The Dutch universities involved in this case intend to investigate everything that Stapel published over his career, with the goal of cleaning up the entire scientific record. This intent contrasts with Harvard University’s response to the fraudulent work published by one of its faculty members, Marc Hauser. In that case, Harvard limited its investigation to the specific papers that had been challenged and has kept its findings confidential. For the sake of science, when fraud is uncovered, the field needs to know exactly which studies are based on falsified data.

Scientists in the field of social psychology must explore what they can do to prevent fraud in the future. Greater transparency with data, including depositing data in repositories where they can be accessed by other scientists (as is done in some other fields), might have sped up detection of this fraud, and it would certainly make researchers more careful about the analyses that they publish. Although many social psychologists are reluctant to share their data, fearing that their analyses will be criticized or that they will be scooped, increasing transparency in this way is important. The zeitgeist around replication must also change, because replication is the cornerstone of a cumulative science. Thus, the field of social psychology needs to develop policies that facilitate and encourage systematic replication. And in all of the sciences, discussion of issues related to data replication should become part of student training, along with the development of better systems for reporting suspected misconduct or fraud.

See also:

PENG, “Reproducible Research in Computational Science,” Science (2011) Vol. 334, No. 6060, pp. 1226-1227

SANTER, “The Reproducibility of Observational Estimates of Surface and Atmospheric Temperature Change,” Science (2011) Vol. 334, No. 6060, pp. 1232-1233

IOANNIDIS, “Improving Validation Practices in ‘Omics’ Research,” Science (2011) Vol. 334, No. 6060, pp. 1230-1232
