BY NIKOLA STIKOV
One of the newest initiatives of the OHBM is the establishment of a replication award to highlight the Organization’s commitment to reproducibility and transparency in neuroimaging research. The OHBM Replication Award will recognize the best replication study of the past year. The 2017 award is generously supported by the Laura and John Arnold Foundation.
Continuing with the open science coverage on this blog, I interviewed Chris Gorgolewski of the Center for Reproducible Neuroscience at Stanford University to discuss the rules and implications of this new initiative.
Nikola Stikov (NS): First of all, what is a replication study?
Chris Gorgolewski (CG): A replication study is a repetition of a published study's procedure with minor changes to variables assumed not to be important for the measured phenomena (this depends on the experiment, but could include demographics, scanner model, visual stimulus delivery system, analysis strategy, etc.). Replication studies usually (but not always) have a larger sample size than the original study for appropriate statistical power, and are performed by a different team than the original study (although planning of a replication study can benefit from involvement of the original researchers). Even though minor changes between the original study and its replication are inevitable, they should be minimized as much as possible.
NS: What about methodological replications? Could a study applying different data processing streams to the same data (in contrast to acquiring new data) be eligible for the award?
CG: Yes, such studies should be considered a form of replication and will be eligible for the award. Since the relative importance of methodological replications versus traditional ones varies considerably, the impact of such submissions will have to be evaluated by the judges on a case-by-case basis.
NS: What are the criteria used to choose the best paper?
CG: Each paper will be evaluated along two dimensions: the quality of the replication attempt and the importance of the evaluated finding. Several factors can improve the quality of a replication study: preregistration (especially if the registration was first evaluated by the researchers who designed the original study), sample size (and thus statistical power), transparency (publication of code and data), and lack of conflicts of interest. The importance of the evaluated finding rests on the degree to which it answers an interesting and important question. For example, findings that form the basis of a whole new branch of neuroimaging, challenge existing models of cognition, or underpin policy changes in mental health care should be considered more important and worth replicating. Admittedly, the second criterion is very subjective, but we are confident that the jury will do a good job evaluating all of the submissions.
NS: So does every replication need to be preregistered and fully open?
CG: Not necessarily. We wouldn’t discredit studies that chose not to be fully transparent (i.e., did not share code or data) or did not preregister their methods. After all, even a non-preregistered replication attempt with closed code and data is a valuable contribution to scientific knowledge. Having said that, if I were presented with two identically powered replication studies, one preregistered with shared code and data and the other not, I would personally have greater trust in the more transparent of the two.
NS: You mentioned “replication attempt”. Are failed replications also eligible for the award?
CG: Absolutely! Replication studies are meant to accumulate knowledge, and both null and statistically significant results contribute to our understanding of a given phenomenon. For example, a well-powered failed replication challenging an important study can be very valuable in preventing the field from pursuing a “dead end”.
NS: Are researchers allowed to nominate their own paper or does someone else have to do it?
CG: Self-nominations are perfectly fine.
NS: How about old replication studies, are they eligible?
CG: Yes. For this year’s inaugural edition (2016), there are no restrictions in terms of recency. This might change in the following years (limiting the award to papers published in the previous year).
NS: Is there enough time to submit for people who just found out about the award? Getting reviews and resubmitting revisions of a replication paper can take at least half a year.
CG: Preprints that have not yet undergone formal peer review are perfectly acceptable submissions for the replication award, so you don’t need to wait until your paper is accepted. Furthermore, the submission deadline has been pushed back to 22 February 2017.
NS: Can scientists reuse old data collected in their lab to perform a replication study?
CG: Of course! In fact, I expect most labs are sitting on a wealth of replication data that was never published. All it takes to be eligible for the OHBM Replication Award is to write it up as a preprint and apply.
NS: You said that for the award preprints are sufficient, but which journals are likely to accept such a study for publication?
CG: PLOS, Frontiers, and Nature Scientific Reports seem like good bets, as they do not use “impact” as a criterion for acceptance. NeuroImage: Clinical should also be happy to accept replication studies, given that it has made an explicit editorial call for them. Cortex supports a Registered Reports article type, which guarantees publication of your results regardless of the outcome of the experiment, provided the journal first accepts your preregistration report. This mechanism might be very useful for replications (since writing a preregistration plan for a replication is easier than for a standard study). There are probably more journals happy to publish replications - you just need to try!
NS: How was the idea for the OHBM Replication Award conceived?
CG: It was proposed by Russell Poldrack, Jean-Baptiste Poline, David Kennedy, Thomas Nichols and myself.
NS: What is the process to nominate a paper for the award?
CG: Just send a link to the paper/preprint you are nominating together with a short paragraph justifying your nomination to firstname.lastname@example.org.
NS: Chris, thank you so much for answering so many questions about this new award. We look forward to seeing the impact of recognizing reproducible results in neuroimaging research!
You can find more information about the OHBM Replication Award here.