Research Article

Exploring the understanding of reproducibility among stakeholders within academia and their expectations for a web-based education tool: A qualitative study

Received 31 Jan 2024, Accepted 09 Apr 2024, Published online: 05 May 2024

References

  • “AllTrials”. 2024. Main Page. https://www.alltrials.net/find-out-more/about-alltrials/.
  • “ATLAS.ti”. 2024. Main Page. https://atlasti.com/.
  • Baker, M. 2016. “1,500 Scientists Lift the Lid on Reproducibility.” Nature 533 (7604): 452–454. https://doi.org/10.1038/533452a.
  • Barba, L. A. 2018. “Terminologies for Reproducible Research.” arXiv Preprint. https://doi.org/10.48550/ARXIV.1802.03311.
  • Begley, C. G., and J. P. A. Ioannidis. 2015. “Reproducibility in Science: Improving the Standard for Basic and Preclinical Research.” Circulation Research 116 (1): 116–126. https://doi.org/10.1161/CIRCRESAHA.114.303819.
  • Begley, C. G., and L. M. Ellis. 2012. “Raise Standards for Preclinical Cancer Research.” Nature 483 (7391): 531–533. https://doi.org/10.1038/483531a.
  • Bouter, L. M., J. Tijdink, N. Axelsen, B. C. Martinson, and G. Ter Riet. 2016. “Ranking Major and Minor Research Misbehaviors: Results from a Survey Among Participants of Four World Conferences on Research Integrity.” Research Integrity and Peer Review 1 (1): 17. https://doi.org/10.1186/s41073-016-0024-5.
  • Braun, V., and V. Clarke. 2006. “Using Thematic Analysis in Psychology.” Qualitative Research in Psychology 3 (2): 77–101. https://doi.org/10.1191/1478088706qp063oa.
  • Braun, V., and V. Clarke. 2021. “To Saturate or Not to Saturate? Questioning Data Saturation as a Useful Concept for Thematic Analysis and Sample-Size Rationales.” Qualitative Research in Sport, Exercise and Health 13 (2): 201–216. https://doi.org/10.1080/2159676X.2019.1704846.
  • Buljan, I., M. Franka Žuljević, N. Bralić, L. Ursić, and L. Puljak. 2023. Understanding of Concept of Reproducibility and Defining the Needs of the Stakeholders: A Qualitative Study. Open Science Framework. https://doi.org/10.17605/OSF.IO/H9RT7.
  • Button, K. S. 2018. “Reboot Undergraduate Courses for Reproducibility.” Nature 561 (7723): 287. https://doi.org/10.1038/d41586-018-06692-8.
  • Button, K. S., J. P. A. Ioannidis, C. Mokrysz, B. A. Nosek, J. Flint, E. S. J. Robinson, and M. R. Munafò. 2013. “Power Failure: Why Small Sample Size Undermines the Reliability of Neuroscience.” Nature Reviews Neuroscience 14 (5): 365–376. https://doi.org/10.1038/nrn3475.
  • Camerer, C. F., A. Dreber, F. Holzmeister, T.-H. Ho, J. Huber, M. Johannesson, M. Kirchler, G. Nave, B. A. Nosek, T. Pfeiffer, et al. 2018. “Evaluating the Replicability of Social Science Experiments in Nature and Science Between 2010 and 2015.” Nature Human Behaviour 2 (9): 637–644. https://doi.org/10.1038/s41562-018-0399-z.
  • Cole, N. L., S. Reichmann, and T. Ross-Hellauer. 2023. “Toward Equitable Open Research: Stakeholder Co-Created Recommendations for Research Institutions, Funders and Researchers.” Royal Society Open Science 10 (2): 221460. https://doi.org/10.1098/rsos.221460.
  • “Croatian Reproducibility Network”. 2023. Main Page. https://crorin.hr/.
  • Teixeira da Silva, J. A. 2015. “Negative Results: Negative Perceptions Limit Their Potential for Increasing Reproducibility.” Journal of Negative Results in BioMedicine 14 (1): 12. https://doi.org/10.1186/s12952-015-0033-9.
  • Diaba-Nuhoho, P., and M. Amponsah-Offeh. 2021. “Reproducibility and Research Integrity: The Role of Scientists and Institutions.” BMC Research Notes 14 (1): 451. https://doi.org/10.1186/s13104-021-05875-3.
  • Dudda, L. A., M. Kozula, T. Ross-Hellauer, E. Kormann, R. Spijker, N. DeVito, G. Gopalakrishna, V. Van den Eynden, P. Onghena, F. Naudet, et al. 2023. “Scoping Review and Evidence Mapping of Interventions Aimed at Improving Reproducible and Replicable Science: Protocol.” Open Research Europe 3 (October): 179. https://doi.org/10.12688/openreseurope.16567.1.
  • “Embassy of Good Science”. 2024. Main Page. https://embassy.science/wiki/Main_Page.
  • Errington, T. M., E. Iorns, W. Gunn, F. Elisabeth Tan, J. Lomax, and B. A. Nosek. 2014. “An Open Investigation of the Reproducibility of Cancer Biology Research.” eLife 3 (December): e04333. https://doi.org/10.7554/eLife.04333.
  • Evans, N., M. Van Hoof, L. Hartman, A. Marusic, B. Gordijn, K. Dierickx, L. Bouter, and G. Widdershoven. 2021. “EnTIRE: Mapping Normative Frameworks for EThics and Integrity of REsearch.” Research Ideas and Outcomes 7 (November): e76240. https://doi.org/10.3897/rio.7.e76240.
  • Ferguson, J., R. Littman, G. Christensen, E. Levy Paluck, N. Swanson, Z. Wang, E. Miguel, D. Birke, and J.-H. Pezzuto. 2023. “Survey of Open Science Practices and Attitudes in the Social Sciences.” Nature Communications 14 (1): 5401. https://doi.org/10.1038/s41467-023-41111-1.
  • Freedman, L. P., I. M. Cockburn, and T. S. Simcoe. 2015. “The Economics of Reproducibility in Preclinical Research.” PLOS Biology 13 (6): e1002165. https://doi.org/10.1371/journal.pbio.1002165.
  • Gabelica, M., R. Bojčić, and L. Puljak. 2022. “Many Researchers Were Not Compliant with Their Published Data Sharing Statement: A Mixed-Methods Study.” Journal of Clinical Epidemiology 150 (October): 33–41. https://doi.org/10.1016/j.jclinepi.2022.05.019.
  • Goodman, S. N., D. Fanelli, and J. P. A. Ioannidis. 2016. “What Does Research Reproducibility Mean?” Science Translational Medicine 8 (341). https://doi.org/10.1126/scitranslmed.aaf5027.
  • Gould, E., H. Fraser, T. Parker, S. Nakagawa, S. Griffith, P. Vesk, F. Fidler, et al. 2023. “Same Data, Different Analysts: Variation in Effect Sizes Due to Analytical Decisions in Ecology and Evolutionary Biology.” EcoEvoRxiv Preprint. https://doi.org/10.32942/X2GG62.
  • Gundersen, O. E. 2021. “The Fundamental Principles of Reproducibility.” Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 379 (2197): 20200210. https://doi.org/10.1098/rsta.2020.0210.
  • Hardwicke, T. E., J. D. Wallach, M. C. Kidwell, T. Bendixen, S. Crüwell, and J. P. A. Ioannidis. 2020. “An Empirical Assessment of Transparency and Reproducibility-Related Research Practices in the Social Sciences (2014–2017).” Royal Society Open Science 7 (2): 190806. https://doi.org/10.1098/rsos.190806.
  • Haven, T., G. Gopalakrishna, J. Tijdink, D. Van Der Schot, and L. Bouter. 2022. “Promoting Trust in Research and Researchers: How Open Science and Research Integrity Are Intertwined.” BMC Research Notes 15 (1): 302. https://doi.org/10.1186/s13104-022-06169-y.
  • Ioannidis, J. P. A. 2005. “Why Most Published Research Findings Are False.” PLOS Medicine 2 (8): e124. https://doi.org/10.1371/journal.pmed.0020124.
  • Kaiser, J., and J. Brainard. 2023. “Ready, Set, Share!” Science 379 (6630): 322–325. https://doi.org/10.1126/science.adg8142.
  • Kamberelis, G., and G. Dimitriadis. 2005. “Focus Groups: Strategic Articulations of Pedagogy, Politics, and Inquiry.” In The SAGE Handbook of Qualitative Research, 3rd ed., 887–907.
  • Kohrs, F. E., S. Auer, A. Bannach-Brown, S. Fiedler, T. Laura Haven, V. Heise, C. Holman, et al. 2023. Eleven Strategies for Making Reproducible Research and Open Science Training the Norm at Research Institutions. Open Science Framework. https://doi.org/10.31219/osf.io/kcvra.
  • Labib, K., N. Evans, R. Roje, P. Kavouras, A. Reyes Elizondo, W. Kaltenbrunner, I. Buljan, T. Ravn, G. Widdershoven, L. Bouter, et al. 2022. “Education and Training Policies for Research Integrity: Insights from a Focus Group Study.” Science & Public Policy 49 (2): 246–266. https://doi.org/10.1093/scipol/scab077.
  • Lash, T. L. 2017. “The Harm Done to Reproducibility by the Culture of Null Hypothesis Significance Testing.” American Journal of Epidemiology 186 (6): 627–635. https://doi.org/10.1093/aje/kwx261.
  • Macleod, M., and the University of Edinburgh Research Strategy Group. 2022. “Improving the Reproducibility and Integrity of Research: What Can Different Stakeholders Contribute?” BMC Research Notes 15 (1): 146. https://doi.org/10.1186/s13104-022-06030-2.
  • Miyakawa, T. 2020. “No Raw Data, No Science: Another Possible Source of the Reproducibility Crisis.” Molecular Brain 13 (1): 24. https://doi.org/10.1186/s13041-020-0552-2.
  • Munafò, M. R., C. D. Chambers, A. M. Collins, L. Fortunato, and M. R. Macleod. 2020. “Research Culture and Reproducibility.” Trends in Cognitive Sciences 24 (2): 91–93. https://doi.org/10.1016/j.tics.2019.12.002.
  • Munafò, M. R., B. A. Nosek, D. V. M. Bishop, K. S. Button, C. D. Chambers, N. Percie Du Sert, U. Simonsohn, E.-J. Wagenmakers, J. J. Ware, and J. P. A. Ioannidis. 2017. “A Manifesto for Reproducible Science.” Nature Human Behaviour 1 (1): 0021. https://doi.org/10.1038/s41562-016-0021.
  • National Academies of Sciences, Engineering, and Medicine. 2019a. “Improving Reproducibility and Replicability.” In Reproducibility and Replicability in Science. Washington, DC: National Academies Press. https://doi.org/10.17226/25303.
  • National Academies of Sciences, Engineering, and Medicine. 2019b. Reproducibility and Replicability in Science. Washington, DC: National Academies Press. https://doi.org/10.17226/25303.
  • National Academies of Sciences, Engineering, and Medicine, Policy and Global Affairs, Committee on Science, Engineering, Medicine, and Public Policy, and Committee on Responsible Science. 2017a. “Context and Definitions.” In Fostering Integrity in Research. Consensus Study Report. Washington, DC: The National Academies Press (US). https://www.ncbi.nlm.nih.gov/books/NBK475954/.
  • National Academies of Sciences, Engineering, and Medicine, Policy and Global Affairs, Committee on Science, Engineering, Medicine, and Public Policy, and Committee on Responsible Science. 2017b. “Identifying and Promoting Best Practices for Research Integrity.” In Fostering Integrity in Research. Consensus Study Report. Washington, DC: The National Academies Press (US). https://www.ncbi.nlm.nih.gov/books/NBK475945/.
  • National Academies of Sciences, Engineering, and Medicine, Policy and Global Affairs, Committee on Science, Engineering, Medicine, and Public Policy, and Committee on Responsible Science. 2017c. “Incidence and Consequences.” In Fostering Integrity in Research. Consensus Study Report. Washington, DC: The National Academies Press (US). https://www.ncbi.nlm.nih.gov/books/NBK475945/.
  • National Council for Science. 2009. “Pravilnik o Znanstvenim i Umjetničkim Područjima, Poljima i Granama [Ordinance on Scientific and Artistic Areas, Fields and Branches] - NN 118/2009.” https://narodne-novine.nn.hr/clanci/sluzbeni/2009_09_118_2929.html.
  • National Institutes of Health. 2023. “Enhancing Reproducibility Through Rigor and Transparency.” https://grants.nih.gov/policy/reproducibility/index.htm.
  • Nelson, N. C., K. Ichikawa, J. Chung, and M. M. Malik. 2021. “Mapping the Discursive Dimensions of the Reproducibility Crisis: A Mixed Methods Analysis.” PLOS ONE 16 (7): e0254090. https://doi.org/10.1371/journal.pone.0254090.
  • Nelson, G. M., and D. L. Eggett. 2017. “Citations, Mandates, and Money: Author Motivations to Publish in Chemistry Hybrid Open Access Journals.” Journal of the Association for Information Science and Technology 68 (10): 2501–2510. https://doi.org/10.1002/asi.23897.
  • “Netherlands Research Integrity Network”. 2024. Main Page. https://nrin.nl/.
  • “Null Hypothesis Initiative”. 2024. Main Page. https://nullhypothesis.com/about.php.
  • “OpenAI”. 2024. How to Find a Trustworthy Repository for Your Data. https://chat.openai.com/g/g-HMNcP6w7d-data-analyst.
  • “OpenAIRE”. 2024. How to Find a Trustworthy Repository for Your Data. https://www.openaire.eu/find-trustworthy-data-repository.
  • Open Science Collaboration. 2015. “Estimating the Reproducibility of Psychological Science.” Science 349 (6251): aac4716. https://doi.org/10.1126/science.aac4716.
  • “Open Science Netherlands”. 2024. Landing Page. https://www.openscience.nl/en.
  • Parsons, S., F. Azevedo, M. M. Elsherif, S. Guay, O. N. Shahim, G. H. Govaart, E. Norris, A. O’Mahony, A. J. Parker, A. Todorovic, et al. 2022. “A Community-Sourced Glossary of Open Scholarship Terms.” Nature Human Behaviour 6 (3): 312–318. https://doi.org/10.1038/s41562-021-01269-4.
  • Pawel, S., R. Heyard, C. Micheloud, and L. Held. 2023. Replication of “Null Results” – Absence of Evidence or Evidence of Absence? Preprint. eLife. https://doi.org/10.7554/eLife.92311.1.
  • Plesser, H. E. 2018. “Reproducibility Vs. Replicability: A Brief History of a Confused Terminology.” Frontiers in Neuroinformatics 11 (January): 76. https://doi.org/10.3389/fninf.2017.00076.
  • Pusztai, L., C. Hatzis, and F. Andre. 2013. “Reproducibility of Research and Preclinical Validation: Problems and Solutions.” Nature Reviews Clinical Oncology 10 (12): 720–724. https://doi.org/10.1038/nrclinonc.2013.171.
  • “ReproducibiliTea”. 2024. ReproducibiliTea Podcast. https://soundcloud.com/reproducibiliteas.
  • “ReproducibiliTeach”. 2024. ReproducibiliTeach YouTube Channel. https://www.youtube.com/channel/UCZR3nicxfGYQbGbcf2krA7g.
  • Resnik, D. B., and A. E. Shamoo. 2017. “Reproducibility and Research Integrity.” Accountability in Research 24 (2): 116–123. https://doi.org/10.1080/08989621.2016.1257387.
  • Roje, R., A. Reyes Elizondo, W. Kaltenbrunner, I. Buljan, and A. Marušić. 2023. “Factors Influencing the Promotion and Implementation of Research Integrity in Research Performing and Research Funding Organizations: A Scoping Review.” Accountability in Research 30 (8): 633–671. https://doi.org/10.1080/08989621.2022.2073819.
  • Romero, F. 2019. “Philosophy of Science and the Replicability Crisis.” Philosophy Compass 14 (11): e12633. https://doi.org/10.1111/phc3.12633.
  • Stewart, S. L. K., C. R. Pennington, G. R. Da Silva, N. Ballou, J. Butler, Z. Dienes, C. Jay, S. Rossit, A. Samara, and UK Reproducibility Network (UKRN) Local Network Leads. 2022. “Reforms to Improve Reproducibility and Quality Must Be Coordinated Across the Research Ecosystem: The View from the UKRN Local Network Leads.” BMC Research Notes 15 (1): 58. https://doi.org/10.1186/s13104-022-05949-w.
  • Stieglitz, S., K. Wilms, M. Mirbabaie, L. Hofeditz, B. Brenger, A. López, and S. Rehwald. 2020. “When Are Researchers Willing to Share Their Data? – Impacts of Values and Uncertainty on Open Data in Academia.” PLOS ONE 15 (7): e0234172. https://doi.org/10.1371/journal.pone.0234172.
  • Ulpts, S., and J. W. Schneider. 2023. Knowledge Production Modes: The Relevance and Feasibility of ‘Reproducibility’. MetaArXiv. https://doi.org/10.31222/osf.io/ujnd9.
  • Toelch, U., and D. Ostwald. 2018. “Digital Open Science—Teaching Digital Tools for Reproducible and Transparent Research.” PLOS Biology 16 (7): e2006022. https://doi.org/10.1371/journal.pbio.2006022.
  • Tong, A., P. Sainsbury, and J. Craig. 2007. “Consolidated Criteria for Reporting Qualitative Research (COREQ): A 32-Item Checklist for Interviews and Focus Groups.” International Journal for Quality in Health Care 19 (6): 349–357. https://doi.org/10.1093/intqhc/mzm042.
  • “United Kingdom Reproducibility Network”. 2023. Main Page. https://www.ukrn.org/.
  • “United Kingdom Reproducibility Network”. 2024. UK Reproducibility Network Terms of Reference, version 4.4. Open Science Framework. https://osf.io/dnhqp.
  • Youyou, W., Y. Yang, and B. Uzzi. 2023. “A Discipline-Wide Investigation of the Replicability of Psychology Papers Over the Past Two Decades.” Proceedings of the National Academy of Sciences 120 (6): e2208863120. https://doi.org/10.1073/pnas.2208863120.
  • Zuiderwijk, A., R. Shinde, W. Jeng, and F. Sudzina. 2020. “What Drives and Inhibits Researchers to Share and Use Open Research Data? A Systematic Literature Review to Analyze Factors Influencing Open Research Data Adoption.” PLOS ONE 15 (9): e0239283. https://doi.org/10.1371/journal.pone.0239283.