The Psychologist Busted For Faking At Least 55 Experiments

April 28, 2013 10:58 am

The New York Times reports on Diederik Stapel, psychology’s most notorious fraudster. But the problems with psychology — and science — go far beyond Stapel’s deception:

At the end of November, the universities unveiled their final report at a joint news conference: Stapel had committed fraud in at least 55 of his papers, as well as in 10 Ph.D. dissertations written by his students. The students were not culpable, even though their work was now tarnished. The field of psychology was indicted, too, with a finding that Stapel’s fraud went undetected for so long because of “a general culture of careless, selective and uncritical handling of research and data.” If Stapel was solely to blame for making stuff up, the report stated, his peers, journal editors and reviewers of the field’s top journals were to blame for letting him get away with it. The committees identified several practices as “sloppy science” — misuse of statistics, ignoring of data that do not conform to a desired hypothesis and the pursuit of a compelling story no matter how scientifically unsupported it may be. [...]

Fraud like Stapel’s — brazen and careless in hindsight — might represent a lesser threat to the integrity of science than the massaging of data and selective reporting of experiments. The young professor who backed the two student whistle-blowers told me that tweaking results — like stopping data collection once the results confirm a hypothesis — is a common practice. “I could certainly see that if you do it in more subtle ways, it’s more difficult to detect,” Ap Dijksterhuis, one of the Netherlands’ best known psychologists, told me. He added that the field was making a sustained effort to remedy the problems that have been brought to light by Stapel’s fraud.

Full Story: The New York Times: The Mind of a Con Man

This is the sort of thing that led to the founding of the Reproducibility Project, which aims to verify studies published in 2008 in Psychological Science, the Journal of Personality and Social Psychology, and the Journal of Experimental Psychology: Learning, Memory, and Cognition.

One thing not really discussed is the difficulty of securing funding for experiments. I have a story coming out on Wired.com tomorrow that deals in part with the state of scientific research, including how hard it is to get funding for research that has a chance of failing. Researchers are spending more time chasing funding than doing research, and the stakes are high to find positive results and publish. Then there's also, as alluded to above, the ever-present publication bias. Update: here's the Wired story.
