As It Happens

Researchers can't replicate their own work and do a rare thing: publish their failure

It's a big deal for researchers to have their studies published in scientific journals, and also to have their work replicated by other scientists. Without replication, their research remains unproven. Psychologist William Horton did something scientists rarely want to do: he tested his own findings -- and failed to replicate them.

Rather than bury the result, he published his failure in the journal PLOS ONE.

"My first reaction was, of course, disappointment," Professor Horton tells As It Happens guest host Helen Mann. "You put a lot of effort into the research that you do, a lot of investment of time and energy, and you work very hard."

The original study, which he co-authored, was titled "The influence of partner-specific memory associations on language production: Evidence from picture naming."

"When you replicate a study, the idea is that you may not find that same pattern if the result is indeed not real," he says. "That seemed to be what happened the second time around."

The decision to publish the replication failure, he acknowledges, is "unusual" in the scientific community.

"It really does not necessarily put you in the best light, so it is kind of risky and I think people are impressed that we would go ahead and take the risk."

Public reaction has been intense. A link to an article about the replication failure even reached the front page of Reddit, drawing hundreds of (mostly) supportive comments.

"I think there's a lot of skepticism out there or a lot of concern out there about the possibility of publication bias. Journals and editors are, of course, looking for novel, exciting results and that's what drives the field forward and that's what helps sustain careers."

He continues with his thoughts on publishing studies about scientific failure: "I certainly do not think it should be [seen as] noble. I think it should be part of the standard practice, that you have a result and in order to have confidence in the result, you want to know whether or not it replicates. I think there's a growing recognition that this kind of thing should be more common."