The 'replication crisis' deepens: 8 more high-profile psychology experiments just got debunked


A new paper published in Nature adds to a growing pile of evidence that the field of psychology is experiencing a "replication crisis," identifying several more high-profile studies from psychology's history that could not be replicated in new experiments.

Still, there is some good news: The paper also demonstrated that researchers were able to "sniff out" past studies that had produced dubious results "with remarkable skill," according to NPR's "Shots."

 



 

About psychology's 'replication crisis'

The so-called "replication crisis" gained attention around 2010 after a paper demonstrated an "impossible" finding—that people could perceive the future—using "completely accepted" research practices, Vox reports.

That led researchers to question whether those "accepted" practices truly represented good science, and ultimately to try to redo past experiments using more rigorous methods, such as larger sample sizes and preregistered study designs.

Thus far, such efforts have uncovered many landmark psychology studies that didn't stand up to scrutiny. For instance, in 2015, a group of psychologists tried to replicate 100 experiments—and found only about 40% of the results could be replicated.

A new paper delves into experiments from the highest-profile journals

To further determine which past studies might have produced unreliable results, researchers for the latest paper sought to replicate experiments from 21 social science papers originally published from 2010 to 2015 in Science and Nature, two of the most prestigious journals.

Brian Nosek, one of the paper's authors, a psychology researcher at the University of Virginia, and the executive director of the Center for Open Science, explained, "Some people have hypothesized that, because [Science and Nature are] the most prominent outlets, they'd have the highest rigor. Others have hypothesized that the most prestigious outlets are also the ones that are most likely to select for very 'sexy' findings, and so may be actually less reproducible."

The researchers tried to replicate one experiment from each of the 21 published papers. To reduce the chance that their results would be swayed by statistical noise, they increased the experiments' sample sizes, ultimately recruiting about five times as many volunteers as the initial experiments had.
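
To see why the larger samples matter, consider a quick simulation. This is my own illustration, not code from the paper, and the true effect size and group sizes below are assumed for the sketch: a sample at a typical original study's size detects a modest real effect only a minority of the time, while a fivefold larger replication sample detects it almost every time, so a failed replication at the larger size is much harder to blame on chance.

```python
# Illustrative power simulation (assumptions: true effect d = 0.3,
# original n = 50 per group, replication n = 250 per group).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def power(n_per_group, d=0.3, trials=5000, alpha=0.05):
    """Fraction of simulated two-sample experiments reaching p < alpha."""
    hits = 0
    for _ in range(trials):
        control = rng.normal(0.0, 1.0, n_per_group)
        treated = rng.normal(d, 1.0, n_per_group)  # true effect of size d
        if stats.ttest_ind(control, treated).pvalue < alpha:
            hits += 1
    return hits / trials

print(f"power at original size (n=50):     {power(50):.2f}")   # ~0.32
print(f"power at replication size (n=250): {power(250):.2f}")  # ~0.92
```

On this logic, a high-powered replication that still comes up empty counts as strong evidence against the original finding.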

According to Vox, the study had limitations, including that the researchers did not replicate every experiment conducted in the papers. As such, Vox reports, the study's results do not necessarily mean the theories behind the original studies are incorrect.

Before they conducted the replications, the researchers also surveyed about 200 social scientists, asking them to predict which findings would be replicable.


What the new paper found

According to the researchers, they could not replicate the results for eight of the 21 original experiments. Even for the 13 experiments that were successfully reproduced, however, the researchers found effects that were, on average, only about 50% as strong as those identified in the original experiments.
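
The shrinking effect sizes have a plausible statistical explanation, sketched below with simulated data. This is an illustration of publication selection (sometimes called the "winner's curse"), not an analysis from the paper, and every parameter is assumed: when only statistically significant results reach print, the published estimates overshoot the true effect, and a high-powered replication drifts back toward the smaller true value.

```python
# Illustrative "winner's curse" simulation: keeping only significant
# results inflates published effect sizes. Assumed values: true effect
# d = 0.3, n = 50 per group.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_d, n = 0.3, 50

published = []
for _ in range(20000):
    control = rng.normal(0.0, 1.0, n)
    treated = rng.normal(true_d, 1.0, n)
    if stats.ttest_ind(control, treated).pvalue < 0.05:  # "publishable"
        pooled_sd = np.sqrt((control.var(ddof=1) + treated.var(ddof=1)) / 2)
        published.append((treated.mean() - control.mean()) / pooled_sd)

print(f"true effect:             {true_d}")
print(f"mean 'published' effect: {np.mean(published):.2f}")  # ~0.5, inflated
# A large replication recovers ~0.3, i.e. roughly 60% of the published
# estimate -- shrinkage of the same flavor the paper reports.
```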

Felix Holzmeister, one of the paper's authors and an Austrian economist, said the study "shows statistically significant scientific findings should be interpreted rather cautiously until they have been replicated, even if they have been published in the most renowned journals."

The good news is that the 200 social scientists the researchers surveyed proved highly skilled at predicting in advance which experiments would not replicate. That suggests, "Shots" reports, that in many cases social scientists can spot weak studies without undertaking a full-scale reproduction.

Why the replication crisis persists

If psychology is aware of this replication crisis, why are studies that fall short still being published?

Nosek said that even though social scientists may be able to detect a potentially flawed study, peer reviewers and editors might nonetheless choose to publish it for a number of reasons. For example, Nosek said, "It may be that this finding isn't likely to be true, but if it is true, it is super important, so we do want to publish it because we want to get it into the conversation."

Anna Dreber, a coauthor of the study and an economics professor at the Stockholm School of Economics, was encouraged by the surveyed scientists' skill in predicting which studies could be replicated. That's "great news for science," she said, because it means social scientists could help detect research flaws and prevent researchers from "spend[ing] lots of time and energy and money on results that turn out not to hold." She said, "[T]hat's kind of wasteful for resources and inefficient, so the sooner we find out that a result doesn't hold, the better."

Nosek added, "The social-behavioral sciences are in the midst of a reformation," and social scientists are beginning to be more transparent about their research so that peers can check the accuracy of their conclusions (Harris, "Shots," NPR, 8/27; Resnik, Vox, 8/27).
