There is growing awareness of reproducibility concerns within the scientific literature; whilst journals continue to be filled with seemingly beautiful results, teams of researchers are lifting the lid on the veracity of scientific claims and the replicability of findings in a growing number of disciplines. Highlighting the global scale of this issue, the Open Science Collaboration (2015) performed direct replications of 100 psychological research studies and found that whilst 97% of the original studies reported significant results, only 36% of the replications did. Moreover, the average effect size of the replications was half the magnitude of the average across the original studies. Several human decision-making factors have been proposed to underlie irreproducibility, such as selective reporting and questionable research practices (QRPs; Simmons et al. 2011; John et al. 2012), fueled by pressures to achieve novel, surprising results and incentive structures that reward quantity over quality (Giner-Sorolla 2012; Frith 2020; Munafò et al. 2020). As can be seen in the accompanying figure, many threats can arise at each stage of the hypothetico-deductive scientific process, undermining the robustness of published research and constraining knowledge accumulation. In their ‘manifesto for reproducible science’, Munafò et al. (2017) propose five measures that target these threats with the goal of improving research: (1) methods, (2) reporting and dissemination, (3) reproducibility, (4) evaluation, and (5) incentives. Adoption of open science practices will allow many of these measures to be achieved, and calls for their implementation in addiction science are becoming louder (Heirene 2021; Louderback et al. 2021; Pennington et al. 2021).
However, to be successful, academic journals also need to adapt and recentre their focus on scientific rigor rather than on the aesthetic appeal of study results. In this Editorial, we introduce two types of Registered Reports (RRs) that satisfy all these measures and highlight our commitment by offering these publishing tracks to authors through ART. We first outline the ‘traditional’ RR format (which ART has offered since 2019) and its associated advantages, followed by the introduction of a new initiative launched in April 2021: Peer Community In Registered Reports (PCI RR).
Traditional RRs
RRs represent a publishing model whereby initial peer review is performed before a study is conducted or analyses of secondary data are initiated (Chambers 2013; Nosek & Lakens 2014). RRs differ from study preregistration in that the study protocol is submitted to a journal rather than solely to a registry. Whilst study preregistration has many advantages, such as clearly distinguishing hypothesis testing (‘prediction’) from hypothesis generation (‘postdiction’; Nosek et al. 2018), recent research suggests that it requires careful oversight to protect against selective reporting and undisclosed discrepancies (see Vassar et al. 2019; Bakker et al. 2020). RRs help prevent these issues from occurring in the first place by splitting the peer-review process into two stages: pre-study (Stage 1) and post-study (Stage 2) review. At Stage 1, the authors submit their introduction, methods, and analysis plan to a journal, and expert peer-reviewers then assess the validity of the research questions, the robustness of the proposed methodology, and the rigor of the planned analyses. Following detailed review and revision, proposals that are favorably assessed are issued an ‘In Principle Acceptance’ (IPA). This means that the journal commits to publishing the final article regardless of whether the hypotheses are supported, so long as the authors adhere to their approved protocol and the journal’s formatting requirements. Upon receiving IPA, authors then typically register their protocol in a repository (e.g. via the Open Science Framework). After completing their study, the authors submit a Stage 2 manuscript that includes their original Stage 1 protocol with the results and discussion now affixed (Chambers and Tzavella 2020; see the accompanying figure). Importantly, at this stage confirmatory and exploratory analyses are distinguished, and reviewers cannot reject the article on the basis of the study’s results.
Through this more rigorous bias control, RRs help to produce more credible findings in the wider literature, thereby maximizing trust in scientific research.
RRs therefore improve (1) methods, by allowing peer-reviewers to provide valuable feedback on the study protocol during the crucial planning stages; (2) reporting and dissemination, by ensuring that the accepted protocol is followed and results are interpreted appropriately; (3) reproducibility, by reducing questionable research practices and requiring open data and/or code; (4) evaluation, by mitigating the outmoded emphasis on study results and thereby dissipating publication bias; and (5) incentives, by restructuring the publication process to give authors control and to reward open science practices. RRs also champion replication attempts, which are becoming more mainstream in psychological and other sciences but are largely absent from the addiction literature (Heirene 2021). Emerging findings attest to the value of RRs: null findings are approximately five times more likely in RRs than in regular articles, suggesting that they reduce publication bias and/or QRPs (Allen and Mehler 2019; Scheel et al. 2021). Furthermore, RRs enhance study quality (Soderberg et al. 2020), achieve higher levels of open data and computational reproducibility (Obels et al. 2020), and attract higher altmetric scores and similar-to-slightly greater numbers of citations compared with the general (non-RR) literature (Hummer et al. 2017). Despite these substantial benefits to both researchers and journals, ART is to date one of only four addiction journals offering this publishing format (see Gorman 2019; Pennington et al. 2021). In addition to our journal guidelines, several practical guides for authors of RRs are available (Kiyonaga and Scimeca 2019; Stewart et al. 2020; Center for Open Science 2021).
Peer Community In Registered Reports
A new type of RR initiative is publisher-independent and known as Peer Community In Registered Reports (PCI RR). Launched in April 2021, PCI RR is a community-driven initiative dedicated to reviewing and recommending RRs across the full spectrum of STEM, medicine, the social sciences, and the humanities. Through this route, the Stage 1 protocol and Stage 2 manuscript are submitted for consideration by PCI RR and reviewed independently of journals by expert peer-reviewers. The review process is overseen by accredited ‘Recommenders’, who must pass an entrance test to demonstrate an in-depth understanding of RRs and to ensure high-quality review and decision-making. Following a favorable recommendation at each stage of the process, the manuscript is posted on a preprint server, and the peer reviews and recommendation of the preprint are made openly available via the PCI RR website. The accompanying figure outlines the full process of the PCI RR track. Following completion of the peer-review process, authors of recommended RRs have the option to publish in a growing list of ‘PCI RR-friendly’ journals that commit to accepting the recommendation without additional peer review. Authors are informed at the point of Stage 1 IPA which PCI RR-friendly journals are eligible outlets, and all eligible journals automatically offer Stage 1 IPA without requiring authors to submit their Stage 1 manuscript. Authors who intend to publish their RR in a PCI RR-friendly journal therefore need not submit their manuscript to a journal until after the final positive Stage 2 recommendation. The novelty of this initiative is that it puts authors in control: they choose the destination journal.
In support of this initiative, ART has signed up as a ‘PCI RR-friendly’ journal, which means that we accept Stage 2 manuscripts that have received a positive final recommendation through PCI RR. To guarantee acceptance, authors need to ensure that their RR falls within the scope of the journal and meets its requirements concerning bias control and formatting. It is our view that both traditional and PCI RRs will help reshape the publication process and contribute to a more representative literature, and we look forward to seeing their success.
Conclusions
Academic publishing is changing, and ART is changing with it. In addition to publishing non-RR articles, we now offer two distinct RR tracks: authors can submit articles for pre-study peer review directly to ART through the traditional RR route, or take advantage of guaranteed acceptance in ART, without additional peer review, once their study has been recommended through the PCI RR initiative. We hope that these developments will encourage addiction science to adopt open science practices more actively and help mitigate reproducibility concerns within the published literature. If you are interested in submitting through the RR track, please consult our instructions for authors.
Disclosure statement
Charlotte Pennington is a Handling Editor at Addiction Research & Theory with a particular brief to support ART’s commitment to open science by overseeing Registered Reports. She is also a Recommender for PCI RR and is the UK Reproducibility Network (UKRN) Network Lead at Aston University. Derek Heim is the Editor-in-Chief of ART and has initiated both traditional RRs and the registration of ART as a ‘PCI RR-friendly’ journal.
References
- Allen C, Mehler DMA. 2019. Open science challenges, benefits and tips in early career and beyond. PLOS Biol. 17(5):e3000246.
- Bakker M, Veldkamp CLS, van Assen MALM, Crompvoets EA, Ong HH, Nosek BA, Mellor D. 2020. Ensuring the quality and specificity of preregistrations. PLOS Biol. 18:e3000937.
- Center for Open Science. 2021 Apr 23. Registered reports. Charlottesville: Center for Open Science. https://www.cos.io/initiatives/registered-reports.
- Chambers CD. 2013. Registered reports: a new publishing initiative at Cortex. Cortex. 49:609–610. https://doi.org/10.1016/j.cortex.2012.12.016
- Chambers CD, Tzavella L. 2020 Feb 10. The past, present, and future of Registered Reports. https://osf.io/preprints/metaarxiv/43298/
- Frith U. 2020. Fast lane to slow science. Trends Cogn Sci. 24:1–2.
- Giner-Sorolla R. 2012. Science or art? How aesthetic standards grease the way through the publication bottleneck but undermine science. Perspect Psychol Sci. 7:562–571.
- Gorman DM. 2019. Use of publication procedures to improve research integrity by addiction journals. Addiction. 114:1478–1486.
- Heirene RM. 2021. A call for replications of addiction research: which studies should we replicate and what constitutes a ‘successful’ replication? Addict Res Theory. 29(2):89–89.
- Hummer LT, Singleton Thorn F, Nosek BA, Errington TM. 2017. Evaluating registered reports: a naturalistic comparative study of article impact. https://osf.io/5y8w7/
- John LK, Loewenstein G, Prelec D. 2012. Measuring the prevalence of questionable research practices with incentives for truth telling. Psychol Sci. 23:524–535.
- Kiyonaga A, Scimeca JM. 2019. Practical considerations for navigating registered reports. Trends Neurosci. 42:568–572.
- Louderback ER, Wohl MJ, LaPlante DA. 2021. Integrating open science practices into recommendations for accepting gambling industry research funding. Addict Res Theory. 29(1):79–79.
- Munafò MR, Chambers CD, Collins AM, Fortunato L, Macleod MR. 2020. Research culture and reproducibility. Trends Cogn Sci. 24:91–93.
- Munafò MR, Nosek BA, Bishop DV, Button KS, Chambers CD, Du Sert NP, Simonsohn U, Wagenmakers EJ, Ware JJ, Ioannidis JP. 2017. A manifesto for reproducible science. Nat Hum Behav. 1(1):0021.
- Nosek BA, Ebersole CR, DeHaven AC, Mellor DT. 2018. The preregistration revolution. Proc Natl Acad Sci. 115:2600–2606.
- Nosek BA, Lakens D. 2014. Registered reports: a method to increase the credibility of published results. Social Psychol. 45(3):137–141.
- Obels P, Lakens D, Coles NA, Gottfried J, Green SA. 2020. Analysis of open data and computational reproducibility in registered reports in psychology. Adv Methods Pract Psychol Sci. 3(2):229–237.
- Open Science Collaboration. 2015. Estimating the reproducibility of psychological science. Science. 349:aac4716.
- Pennington CR, Jones A, Bartlett JE, Copeland A, Shaw DJ. 2021. Raising the bar: improving methodological rigour in cognitive alcohol research. Addiction. Advance online publication. https://doi.org/10.1111/add.15563
- Scheel AM, Schijen MRMJ, Lakens D. 2021. An excess of positive results: comparing the standard psychology literature with registered reports. Adv Methods Pract Psychol Sci. 4(2):1–12.
- Simmons JP, Nelson LD, Simonsohn U. 2011. False-positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychol Sci. 22:1359–1366.
- Soderberg CK, Errington TM, Schiavone SR, Bottesini JG, Singleton Thorn F, Vazire S, Nosek BA. 2020. Initial evidence of research quality of Registered Reports compared to the traditional publishing model. https://doi.org/10.31222/osf.io/7x9vy
- Stewart SLK, Rinke EM, McGarrigle R, Lynott D, Lautarescu A, Galizzi M, Farren EK, Crook Z. 2020. Pre-registration and Registered Reports: a primer from UKRN. https://doi.org/10.31219/osf.io/8v2n7
- Vassar M, Roberts W, Cooper CM, Wayant C, Bibens M. 2019. Evaluation of selective outcome reporting and trial registration practices among addiction clinical trials. Addiction. 115:1172–1179.