EDUCATIONAL PSYCHOLOGY

UK Psychology PhD researchers’ knowledge, perceptions, and experiences of open science

Article: 2248765 | Received 30 Jan 2023, Accepted 11 Aug 2023, Published online: 25 Aug 2023

Abstract

To advance the goals and values of open science, it is vital that the next generation of researchers, i.e. PhD researchers, is supported in adopting open science practices. However, to date, there is no comprehensive understanding of psychology PhD researchers’ knowledge, perceptions, and experiences with open science in a UK context. The present study used a pre-registered mixed methods design to fill this gap in the literature, by surveying psychology PhD students in the UK (n = 196) on their experiences with open science, perceptions of open science, and knowledge of open science tools and practices. Our findings demonstrate that while recognition of the problematic nature of questionable research practices was consistently high, knowledge and perceptions of open science tools and practices varied considerably across PhD researchers. In particular, supervisory support and guidance with open science practices were mixed across participants. Perceived benefits of engaging with open science included benefits to employability, signalling researcher credibility, sharing learning and resources, building collaboration and relationships, and wider dissemination of PhD researchers’ work. Perceived barriers included lack of time, financial reasons, fear of scooping, fear of judgement or criticism, and incompatibility with research paradigms (e.g. qualitative research). Implications for policy, including British Psychological Society training and support, are discussed.

1. Introduction

Open science, open scholarship, or open research broadly refers to the movement to improve the replicability, reliability, reproducibility, transparency, and robustness of science (Munafò et al., Citation2017; Nosek et al., Citation2015). The phrase “open science” is an umbrella term encompassing related concepts such as open data, open access publishing, and open pedagogy. It has received considerable attention in the past several years, following concerns about failed replications and a lack of reproducibility in psychological research (Azevedo et al., Citation2022). Open science tools and practices aim to improve trust in psychological data (e.g., Pashler & Wagenmakers, Citation2012), reduce questionable research practices (Fiedler & Schwarz, Citation2016; John et al., Citation2012), and identify cases of scientific misconduct (Callaway, Citation2011). Researchers have proposed ways to increase the uptake of open science practices; Norris and O’Connor (Citation2019), for example, apply a behaviour change model to consider the uptake of open science behaviours.

Early Career Researchers (ECRs), including PhD researchers, graduate students, and early-career faculty, represent the incoming generation of scientists. If the future of science is to continue the momentum that the open science conversation has started, ECRs should be equipped to engage with open science practices and principles. Already, research has shown that ECRs have responded to open science discussions positively and proactively, and have contributed much to the movement to improve psychological science (e.g., Bartlett & Eaves, Citation2019; Farnham et al., Citation2017; Hobson, Citation2019; Orben, Citation2019). Previous empirical literature has also investigated ECRs’ open science practices and attitudes in local contexts. For example, Toribio-Flórez et al. (Citation2021) investigated the open science perspectives and practices of ECRs based at the Max Planck Society in Germany. They concluded that ECRs demonstrated good knowledge of and positive attitudes towards open science. However, other studies have identified that while ECRs are generally positive towards the goals of open science, and actively endorse open science practices such as data sharing (Campbell et al., Citation2019), some also endorse questionable research practices (in Germany; Stürmer et al., Citation2017).

Open science offers many practical benefits to ECRs. Farnham et al. (Citation2017) reported that early career researchers are generally receptive to an open science framework and note certain practical benefits. For example, open access articles often have higher citation rates (Hitchcock, Citation2004), and open practices such as preprints can increase visibility (McKiernan et al., Citation2016) and improve research efficacy (see also Arza & Fressoli, Citation2017). Similarly, open data practices can bolster reproducibility (Obels et al., Citation2020), allow ECRs to scrutinise the evidence more thoroughly (Allen & Mehler, Citation2019), and promote collaboration (e.g., Moshontz et al., Citation2018).

However, unique barriers to open science implementation among early career researchers have also been identified. For example, Everett and Earp (Citation2015) explain how the incentive structures of academia are misaligned with a drive for research to be slow (or “slow science”, as advocated by open scientists), which makes it difficult for ECRs to engage. The authors also propose that a focus on replication studies, instead of testing novel theories and research questions, may allow for a more meaningful embedding of open science principles into graduate programmes in psychology. Similarly, Pownall et al. (Citation2021) identify other barriers, such as the invisible labour involved in engaging with open science, epistemological incompatibilities between qualitative research and open science, and abrasive or unwelcoming cultures of “bropen science” (see Whitaker & Guest, Citation2020) that discourage ECRs from engaging. Likewise, as Nicholas et al. (Citation2019) explain, “it is only ECRs who have gone on to obtain a secure or tenured position who can really afford the time to practise open science.” It is therefore useful to consider how early-career researchers in psychology, a field that has seen much progress in open science practices, perceive the “open science” movement.

So far, a plethora of commentaries and perspective pieces have investigated the benefits of engagement in open science practices, including for ECRs (Allen & Mehler, Citation2019). Similarly, there have been useful commentaries and guidelines for helping ECRs to engage with open science; for example, Kathawalla et al. (Citation2021) provide graduate students with a “roadmap” to engage in open science practice, and Kowalczyk et al. (Citation2022) provide recommendations for senior academics supporting early-career researchers with open science. Furthermore, some recent research has also investigated the attitudes and practices of ECRs in contexts such as Germany (Stürmer et al., Citation2017) and France (Schöpfel et al., Citation2020). However, relatively little is known about the contemporary perspectives of UK-based PhD students in psychology.

2. The present study

The present study aimed to investigate UK psychology PhD researchers’ perspectives on open science, including perceived barriers to and benefits of engagement with open science, experiences with open science, supervisory attitudes towards open science, and perceptions of questionable research practices. This pre-registered study is exploratory and descriptive, aiming to understand (1) the state of PhD researchers’ understanding of open science in the UK and (2) contemporary experiences of navigating open science whilst conducting a PhD in psychology in the UK.

3. Method

3.1. Participants and design

The study used a cross-sectional, self-report online survey consisting of open and closed questions, delivered via Qualtrics. We received 218 survey responses initially. After removing participants who did not consent (n = 4), empty responses (n = 17), and one duplicate, the sample consisted of 196 UK-based psychology PhD researchers (Mage = 30.47, SD = 7.81; female = 143, male = 50, non-binary = 1, prefer not to say = 2). Ethical approval for this study was granted by the Sciences and Technology Cross-Schools Research Ethics Committee at the University of Sussex (Reference: ER/JLT26/9).

3.1.1. Recruitment strategy

We aimed to recruit as diverse a sample as possible, especially to avoid a sample biased towards open science advocacy. Therefore, as well as using social media, we contacted all psychology departments in UK universities to ask for the survey URL to be distributed to their PhD students. To achieve this, we made a list of all UK institutions that offer PhDs in psychology, found contact details for psychology departments, and emailed them. Overall, 107 departments were contacted. To further facilitate recruitment, participants could voluntarily enter a prize draw with a chance of winning a £25 voucher in exchange for participation.

3.2. Procedure and measures

After providing detailed informed consent and demographic information, participants were asked to complete a series of online questions. Study materials and data for this study can be openly accessed here: https://osf.io/5x3z9/

3.2.1. Understanding of open science

We first asked whether participants had “any understanding” of open science on a 1 (“I have heard of it and have a good understanding”) to 4 (“I have never heard of it and don’t know anything about it”) scale. This allowed us to tailor the following questions to the participant’s level of understanding, with one stream of questions for participants who had never heard of open science, and another stream for participants with any level of familiarity with the concept. Participants were then asked to situate their research on a scale from 0 (qualitative only) to 100 (quantitative only). Note that all questions were optional and could be skipped if preferred, aside from the question about having any understanding of open science (required to direct participants to the right series of questions).

Participants who identified themselves as having “some” familiarity with open science were first asked to share the first three words that they associate with the term “open science”, before listing any open science practices that they are familiar with via free-text response boxes. These participants were also asked to share any knowledge that they have of the background of the open science movement, or why it has become more prominent in recent years. There was a free-text box for additional comments.

3.2.2. Understanding of open science practices

Participants were then provided with a list of nine open science practices (e.g., replications, open data, pre-registration, preprints) and five statistics practices (e.g., Bayesian statistics, effect sizes, power analyses) and were asked to tick all that they have engaged with. Participants were then asked when they first learnt about open science from a pre-set list (e.g., “during my undergraduate degree” or “during my Masters degree”) and from which source (e.g., “on a taught module” or “via social media”) with space for “other” free-text responses.

3.2.3. Confidence explaining open science concepts

Participants were then asked to self-rate their confidence in explaining concepts that are often associated with the open science label, such as “publication bias” and “replication crisis”. Confidence was measured using a 1–7 sliding scale from “not confident at all” to “very confident”.

3.2.4. Benefits and barriers of open science

Participants were then asked to report any perceived benefits and barriers that open science may bring to PhD researchers, even if they have not experienced them personally. They were invited to share views using two free-text boxes.

3.2.5. Supervisor engagement with open science

Participants were also asked about their supervisors’ engagement with and attitudes towards open science, measured using a single-item scale from 1 (open science practices are mandatory) to 6 (open science practices are prevented), and were provided with a free-text box to explain their response if they wished to do so.

3.2.6. Perception of questionable research practices

Perceptions of significant and null findings were measured on a 1–5 scale from “strongly disagree” to “strongly agree”, using statements such as “If someone does good scientific work, they produce significant results” (a measure taken from Krishna et al., Citation2018). Participants were also asked to indicate how problematic they viewed 11 QRPs to be, including “selectively reporting studies” and “falsifying data”, on a 1–5 scale from “sensible” to “problematic”, again taken from Krishna et al. (Citation2018). All participants were then taken to the closing page of the survey.

3.3. Analytical approach

Our analytical approach was pre-registered on the Open Science Framework: https://osf.io/5x3z9/. All analyses for this project were exploratory, taking the form of descriptive quantitative analysis or basic content analysis (Drisko & Maschi, Citation2016). Each response to each question was treated as an individual meaning unit. All qualitative content analyses were conducted by one author (EC) and subsequently reviewed by a second member of the authorship team (with review work shared equally across all four remaining authors). Minimal disagreements emerged, with fewer than 5% of responses re-categorised following discussions between EC and each reviewer. Analyses were primarily inductive (bottom-up), deriving codes and subsequent categories from the data. Quantitative analyses were computed using R 4.1.2 (R Core Team, Citation2021) with the packages dplyr 1.0.9 (Wickham et al., Citation2022), ggplot2 3.3.6 (Wickham, Citation2016), and magrittr 2.0.3 (Bache & Wickham, Citation2022).
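The descriptive tallying behind this kind of basic content analysis can be sketched as follows. This is an illustrative Python sketch only (the published analyses were run in R), and the response sets and category names below are hypothetical, not taken from the study data.

```python
from collections import Counter

# Hypothetical coded responses: each free-text answer can carry multiple
# category codes, mirroring the note above that categories were not
# mutually exclusive. The category labels are illustrative only.
coded_responses = [
    {"employability", "researcher credibility"},
    {"researcher credibility"},
    {"learning and resources", "collaboration and relationships"},
]

# Tally how many responses mention each category
# (a response counts at most once per code).
counts = Counter(code for response in coded_responses for code in response)

# Express each count as a percentage of all responses, as in the
# descriptive summaries reported in the Results section.
n = len(coded_responses)
percentages = {code: round(100 * k / n, 2) for code, k in counts.items()}

print(counts["researcher credibility"])  # 2
print(percentages["employability"])      # 33.33
```

Because responses are stored as sets of codes, the per-category percentages can sum to more than 100%, which matches how non-mutually-exclusive categories are reported in the Results.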

3.3.1. A note on positionality

It is important to note that the research team comprises five early-career researchers in psychology, who all critically advocate for open science in different arenas. We are thus very much “part of the praxis” that we are researching, and this will largely guide how we interpret and relate to the data (see Jamieson et al., Citation2023). At the time of the study, all authors were at various stages of our PhDs and, therefore, have a close connection to the population of interest in this study. While our views and experiences of “open science” spaces, tools, and discourses vary between authors, we are all generally sympathetic with the goal of the open science movement to improve psychological science. We approached the data with varying levels of criticality regarding how early-career researchers may (not) be supported within open science conversations. As we engaged in the analysis and extracted recommendations from participants’ responses, we were conscious of ensuring that our approach of advocating for open science education is flexible and inclusive. That is, we are aware that open science advocacy can often come from one specific epistemological and theoretical position, and we are aware that not all researchers can (or indeed should) engage with all open science practices. Therefore, while we share recommendations here, our positionality means that we do not view these as prescriptive or exhaustive. We aimed to engage reflexively with the analysis process, in order to collectively “check in” with our assumptions throughout the data analysis and interpretation stage and ensure we were accurately portraying the data shared by participants. This also guided our decision to use content analysis, focusing on the manifest content shared by participants, instead of going into more depth to construct latent themes.

4. Results

More than three-quarters of the sample (79.59%) were White British. Sixty-three (32.14%) participants were in the first year of study, 42 (21.43%) in the second year, 46 (23.47%) in the third year, 28 (14.29%) in the fourth year, and 14 (7.14%) in “other”; two participants did not answer this question. Participants were well spread across subfields of psychology (including, for example, cognitive n = 43, health n = 31, social n = 34, and developmental psychology n = 13). Thirty-three participants were identified as mostly qualitative researchers (categorised as being in the first quartile of the 0–100 sliding scale; 17.55%), and 96 participants as mostly quantitative (48.98%). The rest were mixed methods (sitting within 50–80 on the scale). Note that all the data for this study can be openly accessed here: https://osf.io/5x3z9/

4.1. Understanding of open science

Familiarity with open science was mixed: just under half of the sample reported a “good understanding” of open science (n = 96; 49%), while others had heard of open science but reported not having a good understanding (n = 60; 30.6%) or not having any understanding (n = 18; 7.1%). Fourteen participants (7.1%) had never heard of open science. We then investigated participants’ confidence in explaining open science concepts, to see whether this related to their familiarity with open science. Figure 1 shows participants’ self-rated confidence in explaining each of the listed concepts related to open science. Overall, participants with a good understanding of open science consistently rated themselves as more confident at explaining each concept than their counterparts with less or no knowledge of open science. When asked where they had first learned about open science, the most popular response was within a taught postgraduate Master’s degree course (n = 38), followed by “Other” (n = 21) and being taught by a supervisor during the PhD (n = 17).

Figure 1. Participants’ mean self-reported confidence in explaining open science concepts (e.g., study pre-registration, publication bias) across four knowledge groups (from “a good understanding” to “never heard of it”). Error bars represent 95% confidence intervals.

We then investigated the qualitative responses to the “knowledge of open science” section, in which participants were asked to share the first three words that they associate with the term “open science”. Note that we did not code the positivity/negativity of the words shared by participants, because the majority of responses were descriptive (e.g., “dissemination”, “preprint”, “open source”); however, a small selection of responses were explicitly negative (e.g., “tedious”, “restrictive”, “difficult”, and “lots of work”). Words typically centred on “transparency” (n = 56), “accessibility” (n = 28), sharing (e.g., sharing data, materials, or knowledge; n = 19), reproducibility (n = 21), and replicability (n = 26), with open science often described as “good” or “better” science. Figure 2 displays the words shared by participants as a word cloud, where words are larger if they were used more often. Several participants mentioned “replication” and “questionable research practices” in their selection of three words, which was mirrored in the subsequent responses to the question about the history of the open science movement.

Figure 2. Word cloud illustrating the words participants used to describe open science, where larger words indicate higher frequency (e.g., “transparent”, “accessible”, and “replicable” were among the most frequent).

In response to the free-text box accompanying questions about participants’ perceptions of the background of the open science movement and its prominence, participants most frequently highlighted the replication crisis as the key reason that more attention is now paid to open science (n = 80), with many also mentioning questionable research practices (n = 28), publication bias (n = 16), and high-profile fraud cases such as Stapel (n = 11). Other comments described the “publish or perish” culture of academia, the struggles many researchers have with correctly using statistics, the historical trend of using predominantly WEIRD samples, and an overall lack of rigour and transparency as generating interest in open science.

In the free-text box within this section, 15 participants provided extra, more general comments about their perceptions of open science. The focus of these comments typically centred around participants’ views of open science as a concept. For example, one participant used this space to advocate for kindness and inclusivity within open science spaces (which was later echoed in the “barriers” question):

Open science is a great thing, but more needs to be done to ensure that it is practiced with kindness and inclusivity. There are people online who are horrible to researchers (under the guise of open science) who may have just made honest or trivial mistakes. This could be a serious barrier to more people practicing open science in the future - open science is all about being open to the idea that you may have made a mistake in your research, we shouldn’t punish people when it turns out they have made a mistake.

Other participants used this space to share views on the mentoring and support available to PhD researchers in psychology (e.g., “I actually think the BPS should make a pre-reg or replication or something mandatory for at least one practical for degrees to be accredited by the BPS.”). Others used this comment box to recommend additional sources of learning about open science (e.g., “RIOT Science is doing a good job”).

4.2. Open science experiences

Participants had highly varied experiences with different open science behaviours, as displayed in Table 1. Most commonly, participants had engaged in open access publishing and sharing data, closely followed by pre-registration and registered reports.

Table 1. Breakdown of open science activities that participants had engaged with

4.3. Supervisor engagement with open science

Experiences with supervisors varied, from mandatory open science policies through to supervisors who actively avoid open science. Ninety-two participants (46.94%) stated that open science practices were either encouraged or mandatory, and 54 (27.55%) stated that their supervisors were neutral. Fifty participants (25.51%) stated that their supervisors either dissuaded open science practices or disagreed with open science.

To understand this further, we explored the textual data that accompanied this question. For one participant, open science practices were simply the norm in their supervision: “my supervisor operates as if there is no other option than open science so I was unaware other people aren’t taking that same steps as me”, while another reported the opposite experience: “my supervisor will only consider publication in 3 closed journals and does not wish to discuss open science as a concept or how to integrate it in our work”. Many supervisors sat firmly in the middle, being generally encouraging but not actively engaging themselves, or lacking the knowledge and experience to support their students with open science behaviours in their PhD work. Several participants also indicated that their supervisors were fearful of being scooped if data or ideas were shared openly. Furthermore, some participants commented negatively about their supervisors and open science, such as “my supervisor only does open science stuff because he thinks it will get him a promotion. He otherwise doesn’t know anything about it or half arses it. I think he’s using it as a vehicle for his own personal gain rather than the gain of his work or the scientific community” and, more concerningly, “my main supervisor does a few dodgy data things which is why, in principle, she is open to it (but just not for her data/research!)”.

4.4. Perceived benefits and costs of open science

Using a content analysis approach, we then analysed the responses to the perceived benefits and costs of open science to PhD researchers.

4.4.1. Perceived benefits

Our content analysis identified six core categories of benefits of engaging with open science: (1) employability (n = 29), (2) researcher credibility (n = 56), (3) learning and resources (n = 18), (4) collaboration and relationships (n = 10), (5) dissemination and accessibility (n = 26), and (6) other benefits (n = 6). These categories were not mutually exclusive, so some responses received multiple codes. Nineteen participants did not give textual answers to this question or responded that they did not know enough about open science to comment on benefits (e.g., “I’m still not 100% sure what open science is so cannot really answer the question.”). Of the responses, 29 participants acknowledged that their attitudes were driven by “employability” concerns after their PhD (e.g., “It is becoming a good skill to demonstrate on your CV”). Others had more personal, value-driven motivations, including a desire to demonstrate “researcher credibility” (n = 56); for example, participants spoke of “getting into good habits” and “starting as you mean to go on”. For other participants (n = 18), the value of open science tools for learning and sharing resources, often with other early-career researchers, was perceived as a benefit (e.g., “allows sharing of data, methods, resources.”). Similarly, some participants (n = 10) explicitly discussed the capacity for open science to create and strengthen opportunities for collaboration and relationships with other researchers (e.g., “your work will be respected, people will want to work with you”). A common perception was also that engaging in open science can aid the accessibility and dissemination of one’s research (n = 26); for example, participants spoke about how open science “gives more people access to your work” and “helps us share our work with the wider community regardless of the null effects”. Responses categorised as “other” (n = 6) included issues such as confidence and improved workflow (e.g., “it also makes your life easier in the long run when coming back to old studies.” and “Helps you gain confidence in your data (good for viva)”).

4.4.2. Perceived barriers

Forty participants wrote “I don’t know” or similar answers to the question about barriers (e.g., “I cannot see any problems with engaging in open science for PhD students”). Among participants who did respond, six core categories of barriers to participating in open science were reported: (1) lack of time (n = 44), (2) financial reasons (n = 16), (3) fear of scooping (n = 8), (4) fear of judgement or criticism (n = 27), (5) incompatibility with research paradigm (n = 8), and (6) other (n = 4). Participants discussed lack of time (n = 44) as a common barrier to engaging in open science, often in relation to specific tools and practices (e.g., “It all takes so long. It’s prohibitive to the short time frame of a phd. E.g. registered reports can take forever. Pre-registration is arduous if you’re not supported by a supervisor.”). Other participants discussed financial barriers (n = 16), which typically referred to publishing open access (e.g., “financial costs of submitting work to an open access journal”). For a smaller number of participants (n = 8), fear of scooping was an important barrier; for example, one participant described “the fear that as a PhD student, who doesn’t have the security of publications … that if someone replicates my research and published the findings before me, that my work then becomes irrelevant … I’m not at a secure stage in my career for this not to matter.”. Aligned with this, some participants (n = 27) also discussed fears that open science tools might expose their work as not credible, worthy, or important, in a way that could damage their careers (e.g., “there can be a fear that your less-than-great work will be more ‘exposed’, even though you’ve since developed and grown as a researcher … It could be quite intimidating”).

Similarly, participants in this category explicitly spoke about fear of judgement and hostility that can come with increased openness (e.g., “you feel like you may be setting yourself up for scrutiny when you are not yet confident in your research ideas.”). Finally, some participants also spoke about how open science does not align with their research paradigm (n = 8); this was particularly common among qualitative researchers (e.g., “I am now focusing on qualitative methods, which don’t fit well with the main approaches e.g. pre-registering. I have seen some attempts to do this for qual research but it isn’t well developed”). A further four participants’ responses were coded as “other” and referred to aspects such as discipline-specific publishing concerns (e.g., “Some journals may not allow you to pre-register or pre-print your paper so this stops you being able to publish in these journals”).

4.5. Perception of questionable research practices

We then explored the descriptive responses to the perception of QRPs questions, starting with participants’ agreement with different statements (see Table 2). Agreement with QRP statements was generally low but rose to the mid-point of the scale for items relating to the opinions of supervisors. This suggests that whilst respondents understood that null effects are not indicative of poor science, supervisors often disagreed. Importantly, participants agreed most with the notion that supervisory feedback is more positive when results are significant. Overall, this indicates an understanding of questionable practices among the sample.

Table 2. Average agreement with questionable research practices statements, where higher scores indicate more agreement with the item. Items are presented in order from least to most agreement

Finally, we explored the extent to which participants viewed different research practices (including QRPs) as sensible or problematic. Table 3 shows that participants overwhelmingly rated non-questionable practices as sensible; items are shown in ascending order of perceived problematic nature. “Falsifying data” was rated among the most problematic, whereas “rounding off p-values” was the QRP viewed as least problematic.

Table 3. Average agreement with the problematic nature of questionable, and non-questionable research practices, where 1 is sensible and 5 is problematic. Practices are shown in ascending order

5. Discussion

The present study aimed to understand the experiences and perceptions of psychology PhD researchers in the UK with open science. Our findings demonstrate that PhD researchers generally have a good appreciation of the problematic nature of most questionable research practices, are engaging with some open science practices and tools, and have a generally solid familiarity with open science as a concept. Participants also identified benefits of open science, broadly discussing how engagement with open, reproducible, and transparent practices may benefit PhD researchers’ careers, opportunities, and perceived research credibility. In this sense, open science may be an ally to early-career researchers (see Hobson, Citation2019), and thus engaging with transparent and robust practices may be useful in helping PhD researchers establish their credibility, enhance the rigour of research (which, in turn, bolsters trust in their scientific findings), and improve their capacity for larger-scale collaborations and opportunities.

However, some barriers were also identified, centring on both perceptions of the negative impact of open science (e.g., time constraints, scooping) and a lack of supervisory engagement and support. Crucially, in line with this, our study has highlighted the vast variability in psychology PhD researchers’ experiences with open science, specifically in the context of support from supervisors. This is problematic, given the capacity for such variability to create and exacerbate inequalities; for example, as engagement with open science becomes increasingly present in hiring and promotion guidance (see McKiernan et al., Citation2016), PhD researchers who do not have support will be disadvantaged (see also Pownall et al., Citation2021). Furthermore, participants’ free-text responses provided an opportunity for a more nuanced appreciation of how PhD researchers navigate the open science landscape. For example, issues such as an incompatibility between open science and qualitative research were cited as an important barrier that must be addressed to ensure the accessibility of open science itself (see Branney et al., Citation2023 for a discussion of qualitative steps to open science). Until open science is open to all research epistemologies and approaches, or, indeed, until there are more legitimate ways for researchers to “opt out” of practices that are not compatible with their work, researchers on the margins will be further disadvantaged.

Taken together, our findings extend the existing literature on psychology students’ perceptions and use of open science practices in other countries. For example, in Canada, Moran et al. (Citation2023) found that students are generally aware of the problematic nature of QRPs but do engage with them in some instances. Similarly, Kvetnaya et al. (Citation2019) examined German psychology students’ use of QRPs and suggested that training about open science may combat the prevalence of QRPs in student work. Krishna et al. (Citation2018) also broadly support these findings and highlight the role of supervisor support in deterring students from using QRPs.

These findings provide the discipline with a contemporary understanding of PhD researchers’ understanding of, and engagement with, open science. All participants in this study were studying in psychology departments and, therefore, this work can contribute meaningfully to the British Psychological Society’s (BPS) policy and strategic aims. In recent years, the BPS has led the way in establishing open science as a research priority (e.g., in the context of COVID-19; O’Connor et al., Citation2020), facilitating uptake of pre-registration (Bosnjak et al., Citation2022), encouraging open data sharing (British Psychological Society, Citation2020), and advocating for Registered Reports (British Psychological Society, Citation2018). Our findings suggest that more could be done to support UK-based early career researchers in navigating the open science landscape. This may include, for example, top-down educational support from organisations such as the UK Reproducibility Network (UKRN) and funders of PhD research, such as UK Research and Innovation (UKRI). This study highlights the urgent need for support, education, and guidance to address the gaps and inequalities in experiences that PhD researchers reported. Therefore, integrating open science training at the earliest stage of research training in psychology (e.g., undergraduate psychology training) could foster a culture of open science in which engagement with these practices is better supported. Indeed, psychological science is currently at a stage where open research practices are becoming increasingly ubiquitous, so, regardless of one’s own opinions of the open science movement, future researchers would be at a disadvantage if they do not at least understand the current landscape.
To equip early-career researchers, this may be achieved by integrating open science into the BPS accreditation standards (see Thibault et al., Citation2023 for a recent discussion) and the Quality Assurance Agency benchmark for psychology. Teaching should cover the history and background of open science, its tools and practices, and critiques of open science (e.g., Pownall et al., Citation2021). Overall, this is a crucial time in psychology’s ongoing reform, and it is time to prioritise the training, guidance, and support of the next generation of psychological scientists.

Open scholarship

This article has earned the Center for Open Science badges for Open Data, Open Materials, and Preregistered. The data and materials are openly accessible at https://osf.io/5x3z9/ and https://osf.io/5jv8g

CRediT statement

Conceptualisation: JT, EC

Methodology: EC, JT, MP, MS, AJ

Formal analysis: EC, MS, JT, MP

Investigation: EC, JT, MP, MS, AJ

Data curation: JT, EC, MS

Writing—original draft: MP, EC

Writing—review and editing: MP, EC, JT, MS, AJ

Project administration: JT, MP, EC

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

Study materials and data for this study can be openly accessed here: https://osf.io/5x3z9/

Additional information

Notes on contributors

Madeleine Pownall

Madeleine Pownall is a Lecturer at the School of Psychology. She is a Fellow of the Leeds Institute for Teaching Excellence.

Jenny Terry

Jenny Terry is a Lecturer in Psychological Methods and Doctoral Researcher in the School of Psychology at the University of Sussex.

Elizabeth Collins

Elizabeth Collins is an Associate Research Analyst at Clarivate.

Martina Sladekova

Martina Sladekova is a Lecturer in Psychological Research Methods and a Doctoral Researcher at the University of Sussex.

Abigail Jones

Abigail Jones is an Assistant Lecturer in Psychology.

References

  • Allen, C., & Mehler, D. M. A. (2019). Open science challenges, benefits and tips in early career and beyond. PLoS Biology, 17(5), e3000246. https://doi.org/10.1371/journal.pbio.3000246
  • Arza, V., & Fressoli, M. (2017). Systematizing benefits of open science practices. Information Services & Use, 37(4), 463–473. https://doi.org/10.3233/ISU-170861
  • Azevedo, F., Liu, M., Pennington, C. R., Pownall, M., Evans, T. R., Parsons, S., & Westwood, S. J. (2022). Towards a culture of open scholarship: The role of pedagogical communities. BMC Research Notes, 15(1), 1–5. https://doi.org/10.1186/s13104-022-05944-1
  • Bache, S. M., & Wickham, H. (2022). magrittr: A forward-pipe operator for R. R package version 2.0.3. https://CRAN.R-project.org/package=magrittr
  • Bartlett, J., & Eaves, J. (2019). Getting to grips with open science. In H. Walton (Ed.), Guide for psychology postgraduates: Surviving postgraduate study (2nd ed., pp. 85–89). British Psychological Society.
  • Bosnjak, M., Fiebach, C. J., Mellor, D., Mueller, S., O’Connor, D. B., Oswald, F. L., & Sokol, R. I. (2022). A template for preregistration of quantitative research in psychology: Report of the joint psychological societies preregistration task force. American Psychologist, 77(4), 602. https://doi.org/10.1037/amp0000879
  • Branney, P., Brooks, J., Kilby, L., Newman, K. L., Norris, E., Pownall, M., Talbot, C. V., Treharne, G. J., & Whitaker, C. (2023). Three steps to open science for qualitative research in psychology. Social and Personality Psychology Compass, 17(4), e12728. https://doi.org/10.1111/spc3.12728
  • British Psychological Society. (2018). Registered reports. The Psychologist. https://www.bps.org.uk/psychologist/registered-reports
  • British Psychological Society. (2020). Position statement on open data. https://cms.bps.org.uk/sites/default/files/2022-06/Open%20data%20position%20statement.pdf
  • Callaway, E. (2011). Report finds massive fraud at Dutch universities. Nature, 479(7371), 15. https://doi.org/10.1038/479015a
  • Campbell, H. A., Micheli-Campbell, M. A., & Udyawer, V. (2019). Early career researchers embrace data sharing. Trends in Ecology & Evolution, 34(2), 95–98. https://doi.org/10.1016/j.tree.2018.11.010
  • Drisko, J. W., & Maschi, T. (2016). Content analysis. Pocket Guide to Social Work Research. https://doi.org/10.1093/acprof:oso/9780190215491.001.0001
  • Everett, J. A., & Earp, B. D. (2015). A tragedy of the (academic) commons: Interpreting the replication crisis in psychology as a social dilemma for early-career researchers. Frontiers in Psychology, 6, 1152. https://doi.org/10.3389/fpsyg.2015.01152
  • Farnham, A., Kurz, C., Öztürk, M. A., Solbiati, M., Myllyntaus, O., Meekes, J., Pham, T. M., Paz, C., Langiewicz, M., Andrews, S., Kanninen, L., Agbemabiese, C., Guler, A. T., Durieux, J., Jasim, S., Viessmann, O., Frattini, S., Yembergenova, D., & Hettne, K. (2017). Early career researchers want open science. Genome Biology, 18(1), 221–224. https://doi.org/10.1186/s13059-017-1351-7
  • Fiedler, K., & Schwarz, N. (2016). Questionable research practices revisited. Social Psychological and Personality Science, 7(1), 45–52. https://doi.org/10.1177/1948550615612150
  • Hitchcock, S. (2004). The effect of open access and downloads (‘hits’) on citation impact: A bibliography of studies. University of Southampton. http://eprints.soton.ac.uk/id/eprint/354006
  • Hobson, H. (2019). Registered reports are an ally to early career researchers. Nature Human Behaviour, 3(10), 1010. https://doi.org/10.1038/s41562-019-0701-8
  • Jamieson, M. K., Pownall, M., & Govaart, G. H. (2023). Reflexivity in quantitative research: A rationale and beginner’s guide. Social and Personality Psychology Compass, 17(4), e12735. https://doi.org/10.1111/spc3.12735
  • John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth-telling. Psychological Science, 23(5), 524–532. https://doi.org/10.1177/0956797611430953
  • Kathawalla, U. K., Silverstein, P., & Syed, M. (2021). Easing into open science: A guide for graduate students and their advisors. Collabra: Psychology, 7(1). https://doi.org/10.1525/collabra.18684
  • Kowalczyk, O. S., Lautarescu, A., Blok, E., Dall’aglio, L., & Westwood, S. J. (2022). What senior academics can do to support reproducible and open research: A short, three-step guide. BMC Research Notes, 15(1), 1–9. https://doi.org/10.1186/s13104-022-05999-0
  • Krishna, A., Peter, S. M., & Wicherts, J. M. (2018). Questionable research practices in student final theses–prevalence, attitudes, and the role of the supervisor’s perceived attitudes. PloS One, 13(8), e0203470. https://doi.org/10.1371/journal.pone.0203470
  • Kvetnaya, T., Frank, M., Brachem, J., Hill, M., Schramm, L. F. F., & Eiberger, A. (2019). Questionable research practices and open science in undergraduate empirical projects: Results from a nationwide survey amongst German psychology students open practices in education (OPINE). Proceedings of the Open Practices in Education, Frankfurt, Germany.
  • McKiernan, E. C., Bourne, P. E., Brown, C. T., Buck, S., Kenall, A., Lin, J., McDougall, D., Nosek, B. A., Ram, K., Soderberg, C. K., Spies, J. R., Thaney, K., Updegrove, A., Woo, K. H., & Yarkoni, T. (2016). Point of view: How open science helps researchers succeed. eLife, 5, e16800. https://doi.org/10.7554/eLife.16800
  • Moran, C., Richard, A., Wilson, K., Twomey, R., & Coroiu, A. (2023). I know it’s bad, but I have been pressured into it: Questionable research practices among psychology students in Canada. Canadian Psychology/Psychologie canadienne, 64(1), 12–24. https://doi.org/10.1037/cap0000326
  • Moshontz, H., Campbell, L., Ebersole, C. R., IJzerman, H., Urry, H. L., Forscher, P. S., Grahe, J. E., McCarthy, R. J., Musser, E. D., Antfolk, J., Castille, C. M., Evans, T. R., Fiedler, S., Flake, J. K., Forero, D. A., Janssen, S. M. J., Keene, J. R., Protzko, J., Aczel, B. & Chartier, C. R. (2018). The Psychological science accelerator: Advancing psychology through a distributed collaborative network. Advances in Methods and Practices in Psychological Science, 1(4), 501–515. https://doi.org/10.1177/2515245918797607
  • Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Du Sert, N. P., Simonsohn, U., Wagenmakers, E. J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), 0021. https://doi.org/10.1038/s41562-016-0021
  • Nicholas, D., Boukacem-Zeghmouri, C., Abrizah, A., Rodríguez-Bravo, B., Xu, J., Świgoń, M. … Herman, E. (2019). Open science from the standpoint of the new wave of researchers: Views from the scholarly frontline. Information Services & Use, 39(4), 369–374. https://doi.org/10.3233/ISU-190069
  • Norris, E., & O’Connor, D. B. (2019). Science as behaviour: Using a behaviour change approach to increase uptake of open science. Psychology & Health, 34(12), 1397–1406. https://doi.org/10.1080/08870446.2019.1679373
  • Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., Buck, S., Chambers, C. D., Chin, G., Christensen, G., Contestabile, M., Dafoe, A., Eich, E., Freese, J., Glennerster, R., Goroff, D., Green, D. P., Hesse, B., Humphreys, M., & Yarkoni, T. (2015). Promoting an open research culture. Science, 348(6242), 1422–1425. https://doi.org/10.1126/science.aab2374
  • Obels, P., Lakens, D., Coles, N. A., Gottfried, J., & Green, S. A. (2020). Analysis of open data and computational reproducibility in registered reports in psychology. Advances in Methods and Practices in Psychological Science, 3(2), 229–237. https://doi.org/10.1177/2515245920918872
  • O’Connor, D. B., Aggleton, J. P., Chakrabarti, B., Cooper, C. L., Creswell, C., Dunsmuir, S., Fiske, S. T., Gathercole, S., Gough, B., Ireland, J. L., Jones, M. V., Jowett, A., Kagan, C., Karanika‐Murray, M., Kaye, L. K., Kumari, V., Lewandowsky, S., Lightman, S., Malpass, D., … Armitage, C. J. (2020). Research priorities for the COVID‐19 pandemic and beyond: A call to action for psychological science. British Journal of Psychology, 111(4), 603–629. https://doi.org/10.1111/bjop.12468
  • Orben, A. (2019). A journal club to fix science. Nature, 573(7775), 465. https://doi.org/10.1038/d41586-019-02842-8
  • Pashler, H., & Wagenmakers, E. J. (2012). Editors’ introduction to the special section on replicability in psychological science: A crisis of confidence? Perspectives on Psychological Science, 7(6), 528–530. https://doi.org/10.1177/1745691612465253
  • Pownall, M., Talbot, C. V., Henschel, A., Lautarescu, A., Lloyd, K. E., Hartmann, H., & Siegel, J. A. (2021). Navigating open science as early career feminist researchers. Psychology of Women Quarterly, 45(4), 526–539. https://doi.org/10.1177/03616843211029255
  • R Core Team. (2021). R: A language and environment for statistical computing. R Foundation for Statistical Computing. https://www.R-project.org/
  • Schöpfel, J., Prost, H., Jacquemin, B., & Kergosien, E. (2020). PhD training on open science in French universities. Cadernos BAD, 1, 115–129.
  • Stürmer, S., Oeberst, A., Trötschel, R., & Decker, O. (2017). Early-career researchers’ perceptions of the prevalence of questionable research practices, potential causes, and open science. Social Psychology, 48(6), 365–371. https://doi.org/10.1027/1864-9335/a000324
  • Thibault, R. T., Bailey-Rodriguez, D., Bartlett, J. E., Blazey, P., Green, R. J., Pownall, M., & Munafò, M. (2023). A Delphi study to strengthen research methods training in undergraduate psychology programmes. PsyArXiv. https://doi.org/10.31234/osf.io/gp9aj
  • Toribio-Flórez, D., Anneser, L., de Oliveira-Lopes, F. N., Pallandt, M., Tunn, I., & Windel, H. (2021). Where do early career researchers stand on open science practices? A survey within the Max Planck Society. Frontiers in Research Metrics and Analysis, 5, 586992. https://doi.org/10.3389/frma.2020.586992
  • Whitaker, K., & Guest, O. (2020). #bropenscience is broken science. The Psychologist, 33(2), 34–38. https://doi.org/10.1080/02690055.2019.1577602
  • Wickham, H. (2016). ggplot2: Elegant graphics for data analysis. Springer-Verlag New York. https://doi.org/10.1007/978-3-319-24277-4
  • Wickham, H., François, R., Henry, L., & Müller, K. (2022). dplyr: A grammar of data manipulation. R package version 1.0.9. https://CRAN.R-project.org/package=dplyr