Research Article

A tale of two formats: Graduate students’ perceptions and preferences of interactivity in Responsible conduct of research education

Received 26 Dec 2023, Accepted 22 Apr 2024, Published online: 09 May 2024

ABSTRACT

Background

The significance of Responsible Conduct of Research (RCR) education in higher education is well-acknowledged. However, the lack of interactivity in online RCR courses remains a concern for course designers and instructors. This research aims to identify the types of interactivity embedded in RCR courses and to examine graduate students’ perceived interactivity in different course formats (online versus face-to-face) using two distinct samples.

Methods/Materials

Study one, involving 191 participants, established the factor structure of the Learner Perceptions of Interactivity Scale for RCR. The results indicated a 15-item scale characterized by three factors: self-control, human-interaction, and information-access. Study two, involving a sample of 390 individuals who received both formats of RCR instruction, confirmed the instrument’s reliability and explored students’ perceptions of interactivity types within the two formats.

Results

Notably, students in Study 2 perceived a higher degree of human interaction in the face-to-face format while attributing more significance to self-control and information access in the online course. Approximately 80% of the students expressed a preference for a fully online course if given another opportunity to choose or recommend a format. This preference was attributed to their inclination toward more control and access, underscoring the significance of these elements in shaping their learning experiences.

1. Introduction

Responsible conduct of research (RCR) is the basis for all research endeavors to ensure the quality and significance of research contributions for both human and scientific development. Consequently, RCR education has evolved into a pivotal issue within higher education in recent years and has, in some ways, demonstrated positive effects (Kalichman Citation2013; Steele et al. Citation2016; Steneck Citation2006). With the rapid prevalence of online education, the potential of web-based learning for RCR courses has naturally emerged. Kalichman (Citation2014) has highlighted the adoption of a self-paced online format for RCR courses in some US institutions and universities, utilizing programs like the Collaborative Institutional Training Initiative (CITI) (Braunschweiger and Goodman Citation2007). However, despite the popularity and accessibility of online learning in higher education, as described by Dumford and Miller (Citation2018), a degree of skepticism persists among certain funding agencies regarding the exclusive implementation of online RCR instruction (e.g., National Institutes of Health Citation2021).

Although courses are gradually reverting to face-to-face teaching in the post-pandemic era, online instruction has emerged as a viable alternative on campuses. This is particularly relevant because RCR courses are mandatory for every graduate student in most Taiwanese universities, featuring a binary grading system (pass/fail). Online learning presents a potential solution to the challenges posed by limited instructor availability and constrained timeslots in traditional classroom settings.

While the current research does not focus on learning achievement, the rationale behind the establishment of RCR instruction requirements, coupled with the influence of different course formats on learning effectiveness, may be attributed to the embedded interactivity (Todd et al. Citation2017). It is crucial to consider learners’ opinions and needs, given that they are the ultimate beneficiaries of teaching, irrespective of pandemic considerations (Lazarevic and Bentz Citation2021; Tratnik, Urh, and Jereb Citation2019). Moreover, we agree with the investigations conducted by Francesca et al. (Citation2023) concerning early career researchers, emphasizing that training in RCR is crucial for many who are either embarking on or are currently engaged in academic research. For this reason, our study primarily focuses on graduate students as research subjects.

The primary objective of this research is dual in nature: first, to discern the types of interactivity embedded in RCR online courses; second, to delve into graduate students’ perceptions of interactivity in different RCR course formats and to assess whether these perceptions influence their decisions regarding future RCR course preferences. Two distinct studies were conducted to attain the objectives. The significance of this research lies not only in addressing concerns about the adequacy of RCR online instruction but also in providing an alternative perspective – interactivity – to scrutinize the design of online RCR courses.

2. Literature review and research questions

2.1. Delivery format of RCR education

According to reviews of RCR education, there is a notable upsurge in the number of courses offered along with a variety of delivery formats, which may have potential impacts on the effectiveness of instruction (Antes et al. Citation2009; Mulhearn et al. Citation2017; Watts et al. Citation2017). Yet, owing to technological advancements and the absence of a globally agreed-upon set of best practices in RCR education, instruction blueprints and delivery formats tend to differ from one university to another (Abdi et al. Citation2021; Barak and Green Citation2020). Results of previous studies demonstrate the importance of scrutinizing existing delivery formats in RCR education. Traditionally, the most common mode of RCR instruction has been face-to-face, in which instructors must make deliberate choices about the structure, content, and learning activities (Antes Citation2014).

As e-learning has gained popularity across universities and institutions in recent years, online RCR education has become increasingly prevalent. A recent study shows that conventional online learning enhances learners’ research ethics awareness and knowledge understanding. Yet, learners tend to gain more confidence in applying the acquired knowledge when the course engages them in online activities and discussions (Barak and Green Citation2020). Whether conducted online or in a face-to-face setting, some form of interaction is deemed essential throughout the RCR learning process.

While instructors undoubtedly hold a crucial role in traditional RCR instruction, it is worth questioning whether human interaction represents the sole type of interactivity that can be incorporated into courses to ensure a satisfactory learning experience. Todd et al. (Citation2017) conducted a thorough meta-analytic comparison of studies involving face-to-face, online, and hybrid (blended) delivery formats. According to their findings, online RCR courses offer several advantages, including nonlinear structures, cognitive flexibility, immediate trainee feedback, and easy-to-revise content. On the other hand, face-to-face courses excel in human interaction and facilitating in-depth, meaningful communication, which proves beneficial for learning complex tasks (Todd et al. Citation2017). Hybrid courses, blending elements of both formats, appear to leverage the strengths of each and thus can be particularly effective when the content is well-defined and designed (i.e., instructional vs process-based). Nevertheless, a later meta-analysis by Katsarov et al. (Citation2022) found that blended courses tend to be less effective than purely online or face-to-face courses. Upon reviewing these inconsistent comparison results, it becomes evident that various types of interactivity other than human interaction may also help engage students in the learning process. Consequently, one of the research goals is to precisely identify which type(s) of interactivity can be embedded in online courses to facilitate and enhance the learning experience.

Moreover, special attention should be given to the audience’s nature and professional experience levels when crafting the format and structure of RCR instruction (Antes Citation2014). This entails taking into account the learners’ experience with technology and perceived technological affordances. In Taiwan, the primary targets of RCR instruction are advanced graduate students and individuals at higher levels of education (i.e., post-docs). These learners are usually heavy technology users and possess experience with a diverse range of Internet applications, such as online learning. Understanding the technological adeptness and prior exposure to online learning among these learners is integral to tailoring the delivery of RCR education effectively.

2.2. Interactivity in the course format

Indeed, interactivity holds significant importance in the learning process, influencing learners, instructors, and the learning content; it has emerged as a pivotal aspect in the development of any course, whether delivered in face-to-face settings or through online platforms (Chou, Peng, and Chang Citation2010; Wei, Peng, and Chou Citation2015). As Reeves (Citation2012) stated, “[A]ll learning is interactive in the sense that learners interact with content to process, tasks to accomplish, and problems to solve with the goal of constructing improved cognitive, affective, conative, and psychomotor learning outcomes” (p. 1602). Domagk et al. (Citation2010) also indicated that interactivity within a multimedia learning system is described as a reciprocal engagement between a learner and the system, where “the [re]action of the learner is dependent upon the [re]action of the system and vice versa” (p. 1032). These early studies reiterated the importance of interactivity, emphasizing its role as a fundamental attribute within learning environments. They further highlighted how interactivity enriches the quality of educational materials, actively engages learners in the learning process, and significantly facilitates effective learning outcomes.

Conventional online learning has faced criticism due to its lack of interactivity and limited opportunities for active engagement (Daily-Hebert Citation2018; Kalichman Citation2014; Todd et al. Citation2017). To successfully embed interactivity seamlessly in an online learning environment, one perspective involves considering technology affordance. For example, Chou (Citation2003) characterized interactivity within a system by emphasizing technology affordances for human action using the following factors: choice, adaptability, playfulness, monitoring information use, and facilitation of interpersonal communication. Chou categorized four types of interaction – learner-interface, learner-content, learner-instructor, and learner-learner – and outlined a list of all possible interactive functions applicable to web-based learning systems to achieve interactivity. Renkl and Atkinson (Citation2007) defined interaction in computer-based learning as a reciprocal process in which the “action” taken by learners and their learning environment are mutually dependent. These researchers further pointed out that such dependence derives from either technical features of learning environments or learners’ cooperation facilitated by technology (Renkl and Atkinson Citation2007, 235). In essence, both system-based functions (like learner-interface and learner-content) that empower learner control and monitor the learning process, as well as user-initiated functions (like learner-instructor and learner-learner) supporting human interaction, are key for creating an interactive learning system. Chou et al. (Citation2010) provided empirical evidence that college students were most familiar with, and made the most frequent use of, functions enabling them to monitor their learning process (e.g., assignment handling, grade-status tracking, material-viewed tracking) and facilitating interpersonal interaction (e.g., online forum) in online course management systems.

From a learning perspective, Kennedy (Citation2004) posited that a multimedia learning environment is not inherently interactive but rather that the interactive features possess the potential to engage learners. These features require active utilization by learners to meaningfully respond to system activities. Domagk et al. (Citation2010) further elaborated on two key interactive features within multimedia learning systems that are conducive to learning. The first feature involves learner control, including manipulation of pacing, content, and representation. Opportunities for learner control empower learners to engage in behavioral activities, subsequently facilitating and directing cognitive and metacognitive activities. The second feature entails guidance, encompassing feedback, reflection prompts, and direct advice, aiming at fostering cognitive and/or metacognitive activities. Highlighting the integration of these interactive elements into online courses, it becomes imperative not only to offer interactive features but also to incorporate them seamlessly into learning tasks (Wei, Peng, and Chou Citation2015). In short, learners are encouraged not just to respond but also to actively engage with these interactive features for effective learning.

The preceding perspectives indicate the value and importance of interactivity and how various interactive functions or features can be employed to enhance the online learning experience. Notably, Lustria (Citation2007) presented an alternative perspective, focusing on user perceptions. The argument posits that without students’ full awareness and use, interactive features become futile, and the desired learning outcomes remain unachieved. This perspective may be particularly significant for the RCR target learners in this research – graduate students – who are typically computer/Internet savvy and have varied learning experiences. While prior research often approached interactivity from the instructor’s or designer’s perspective (Daily-Hebert Citation2018) or compared different student groups (Brockman et al. Citation2020; Lazarevic and Bentz Citation2021), the present research takes a user-centric approach, prioritizing the perspective of the learners. This research attempts to provide empirical evidence by focusing on a consistent group of students who experienced both online and face-to-face formats and analyzing their responses. The scarcity of investigations into the student viewpoint on interactivity in comparing different RCR delivery methods underscores the significance of this research.

In sum, the current research aims to explore how graduate students perceive different types of interactivity embedded in two distinct formats of RCR instruction – online versus face-to-face – after receiving both modes of instruction.

2.3. Research questions

The specific research questions of this research are as follows:

  1. What types of interactivity can be embedded in RCR courses to cater to diverse interaction needs?

  2. How do students perceive the difference in interactivity between fully online and face-to-face RCR courses?

  3. What interactivity-related factors can influence students’ choices of RCR course format?

  4. What interactivity-related factors can influence students’ recommendations for RCR course format to their schoolmates?

3. Methodology

This research consists of two studies. Initially, the authors developed an instrument aimed at gauging students’ perceived interactivity following a comprehensive literature review. This instrument was pilot-tested with a 191-student sample (Study 1), facilitating the identification of interaction types and refinement of the instrument. One year later, the second study (Study 2) was conducted with a sample of 390 students. The subsequent study aimed to investigate differences in students’ perceived interactivity between fully online and face-to-face RCR courses. Additionally, it sought to explore students’ preferences and recommendations regarding course formats to their peers.

3.1. Learning and research context

This research examined two RCR course formats in which the research instrument was administered. The first one is an asynchronous, fully online RCR course provided by the Taiwan Ministry of Education. It was primarily developed by the authors and associates and presented as a series of web-based multimedia (with texts, graphs, figures, and animations) units in traditional Chinese. Aligned with the nine suggested core RCR domains (DuBois et al. Citation2010) and five areas of literacy-based research integrity assessment framework (Chou and Lee Citation2022), the course content includes topics such as research ethics/integrity principles, policies, guidelines, misconduct, and best practices. Additionally, this course addresses common student misunderstandings of RCR (Pan and Chou Citation2015). To enhance interactivity, as suggested by Domagk et al. (Citation2010) and Wei et al. (Citation2015), this course includes interactive features, such as flexibility in learning time/location/order of units, multiple exam attempts, quizzes with feedback, reflection prompts, multimedia examples, and related websites.

Figure 1 shows a screenshot of a selected unit in English for non-native Chinese students. Each course unit takes students approximately 20 minutes to complete at a normal pace. Graduate students seeking certification of completion before their thesis or dissertation defenses must finish all 18 units (approximately 6 hours) and pass a final online exam. Students can retake the final exam a limited number of times per day, with different exam questions for each attempt.

Figure 1. Sample screen of the online course unit with interactive features. Permission obtained from Center for Taiwan Academic Research Ethics Education (Citation2024).


The second course format is a traditional physical classroom setting; that is, a four-hour face-to-face RCR lecture was designed and delivered by the same instructor (one of the authors) in a classroom with the same content slides highlighting the key points in the online course. This lecture also provided examples/cases, two animations (other than those in the 18 units), some pop-up questions for students’ instant response, and a Q&A session at the end. This special arrangement (the same instructor giving the same content) was designed to minimize the possibility of variations in students’ evaluation of the interactivities of face-to-face instruction caused by different lecturers.

3.2. Study 1

In Study 1, a research instrument, the Learner Perceptions of Interactivity Scale for RCR (LPoIS-RCR), was piloted with a cohort of 191 students. This self-developed scale aimed to evaluate student-perceived interactivity across two RCR course formats (fully online and face-to-face). The research instrument underwent a thorough review and received approval from the Research Ethics Committee for Human Subject Protection at the authors’ university, ensuring compliance with ethical standards during the distribution process.

The LPoIS-RCR presents all interactive features of the course format. Specifically, the concept of “structured alternative format” (Harter Citation1982, 89) was employed by designing a 10-point Likert scale with two ends (fully online format and face-to-face format). Participants were asked to indicate the extent of their perceptions by marking the scale in the appropriate place (the scale ranged from 1–5 in two directions). As shown in Figure 2, higher perceived interactivity in the fully online format compared to face-to-face would be represented by a leftward mark for an item like “Decide where and when to learn” (as indicated by marking 4 on the left set of numbers). When performing the analysis, all the numbers on the left set (toward a fully online course) were counted as negative. Consequently, a positive mean score for an item represented a higher perception of interactivity in the face-to-face course, and conversely for a negative mean score.
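The scoring rule above can be sketched in a few lines of code; the function below is purely illustrative (the names and structure are ours, not the authors’ actual analysis script).

```python
def signed_score(side: str, magnitude: int) -> int:
    """Map a mark on the two-ended LPoIS-RCR scale to a signed score.

    Marks toward the fully online end (1-5 on the left) are counted as
    negative; marks toward the face-to-face end are counted as positive,
    so a positive item mean indicates higher perceived interactivity in
    the face-to-face format.
    """
    if magnitude not in range(1, 6):
        raise ValueError("magnitude must be between 1 and 5")
    if side == "online":
        return -magnitude
    if side == "face-to-face":
        return magnitude
    raise ValueError("side must be 'online' or 'face-to-face'")

# Marking 4 toward the online end for "Decide where and when to learn":
example = signed_score("online", 4)  # -> -4
```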

Figure 2. Example for how to respond to each LPoIS-RCR statement/item on perceived interactivity.


In considering the interactivity in a typical face-to-face classroom setting and the technology affordance in Taiwan’s RCR online courseware (as described in Section 3.1), the present study proposes a framework of three types of interaction: learner-control (instead of learner-computer or learner-interface), learner-content, and learner-instructor/learner-learner. This framework further serves as a basis for the present study’s categorization of all interactive features that address various interactive needs for RCR learning. Therefore, this section of the LPoIS-RCR consists of 20 interactive feature items in three interaction types:

  1. Learner-control (6 items): regulate learning pace, decide where and when to learn, control my own learning topics, gain more opportunities to (re)take the exam, monitor learning progress, and be involved in learning activities.

  2. Learner-content (7 items): access to relevant information and resources (e.g., books), access to relevant information regarding the assignments, access to extra learning resources, access to other courseware units, acquisition of learning materials with text/graphics/comics/other formats, acquisition of learning materials in animation format, and answer to the course satisfaction questionnaire.

  3. Learner-instructor/Learner-learner (7 items): more opportunities to interact directly with the instructor, more opportunities to discuss with peers, more opportunities to contact the course developers, access to course Q&A more easily, more opportunities to give feedback on the learning content, more access to course announcement/post, more opportunities to express opinions on the course.

After finishing the draft of the LPoIS-RCR, the authors held a focus group with two instructors with online teaching experience and three graduate students with online learning experience. The primary purpose of the focus group was to ensure the potential respondents would understand the instrument format and items. The wording and layout of the instrument were revised accordingly.

As shown in Table 1, sample 1 consists of 191 graduate students who took the abovementioned fully online RCR course required by their graduate programs and had passed the course. Because the graduate students registered in this online course came from all over Taiwan, the authors used e-mail to invite anyone who held a certificate to voluntarily participate in Study 1. Among the 191 respondents who completed the online version of the LPoIS-RCR, 75 (39.3%) were female, and 116 (60.7%) were male. The students’ research areas included humanities and social science, science, computer science, engineering, management, biological science, and technology. There were 180 (94.2%) students enrolled in master’s degree programs and 11 (5.8%) students enrolled in doctorate degree programs.

Table 1. Descriptive features of graduate student participants sample 1 (N = 191).

3.3. Study 2

The students comprising sample 2 were recruited to participate in this research one year after sample 1. After successful completion of the fully online RCR course (same as sample 1), sample 2 students were required by their respective graduate programs to attend an additional 4-hour face-to-face RCR lecture. There might be a gap of one to two weeks between the online and face-to-face classes, depending on the course design or requirements. Following the lecture, students were invited to answer the paper version of the LPoIS-RCR (as factor-validated by sample 1). The first section required them to rate their perceived interactivity in both online and face-to-face formats, while the additional second section asked them to choose the course format if they needed to retake or take more course units and to recommend the course to their schoolmates. A total of 390 graduate students, as sample 2, completed the revised LPoIS-RCR. The participants were from three different universities in Taiwan. Among these participants, 154 (39.5%) were female, and 236 (60.5%) were male. The students’ research areas included business administration, biological science and technology, and engineering. Specifically, 357 (91.5%) students were enrolled in master’s degree programs, and 33 (8.5%) were enrolled in doctorate degree programs. The demographic features of the graduate student participants are available in Table 2.

Table 2. Descriptive features of graduate student participants sample 2 (N = 390).

3.4. Statistical analysis

The statistical analyses were carried out using SPSS 25.0 software. Descriptive statistical analyses, Mann–Whitney U tests, and logistic regression analysis were performed. Descriptive statistics were shown as numbers and percentages for discrete variables and as means and standard deviations for continuous variables. Numerical variables that did not meet parametric assumptions were compared between two groups using Mann–Whitney U tests. Simple logistic regression analysis was also conducted to examine factors related to students’ choices and recommendations of an RCR course format.
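As a sketch of the group comparison just described (using Python and SciPy rather than the SPSS workflow reported here, with synthetic scores standing in for the study’s data):

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Synthetic factor scores for two hypothetical preference groups;
# negative values lean toward the online format, as in the LPoIS-RCR coding.
rng = np.random.default_rng(0)
online_group = rng.normal(loc=-2.0, scale=2.4, size=60)
face_to_face_group = rng.normal(loc=0.5, scale=2.4, size=20)

# Two-tailed Mann-Whitney U test, the nonparametric comparison named above.
u_stat, p_value = mannwhitneyu(online_group, face_to_face_group,
                               alternative="two-sided")
significant = p_value < 0.05
```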

The construct validity of LPoIS-RCR was evaluated through exploratory factor analysis (EFA by sample 1) and confirmatory factor analysis (CFA by sample 2) using STATA 16.0.

4. Results

4.1. Study 1: Validation of the LPoIS-RCR

An exploratory factor analysis (EFA) for sample 1 (N = 191) was implemented to establish the LPoIS-RCR instrument’s validity. The KMO coefficient yielded a value of 0.91. Bartlett’s test of sphericity resulted in a statistically significant outcome (χ2 = 2950.64, df = 190, p < .001), indicating the data were suitable for factor analysis. Employing a maximum likelihood estimator and varimax rotation, an EFA was executed. The authors deliberately deleted five items with cross-loadings or low loadings (λ < 0.40) after closely checking the pattern coefficients in the rotated matrix.
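Bartlett’s statistic has a standard closed form; the sketch below (our own Python illustration, not the authors’ SPSS/STATA workflow) computes it from a data matrix using toy data.

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(data: np.ndarray):
    """Bartlett's test of sphericity: tests whether the correlation
    matrix differs from the identity matrix, a precondition for EFA.

    chi2 = -((n - 1) - (2p + 5) / 6) * ln|R|,  df = p(p - 1) / 2
    """
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    stat = -((n - 1) - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    df = p * (p - 1) // 2
    return stat, df, chi2.sf(stat, df)

# Toy data: six correlated items answered by 191 respondents.
rng = np.random.default_rng(1)
shared = rng.normal(size=(191, 1))
items = shared + rng.normal(scale=0.8, size=(191, 6))
stat, df, p_value = bartlett_sphericity(items)
```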

After the deletion of those items, fifteen items were retained. The statistics still showed that factor analysis was appropriate (KMO = 0.88, Bartlett’s test of sphericity χ2 = 2084.33, df = 105, p < .001). Three factors were extracted, and this model explained 72.03% of the variance. From the rotated factor matrix, each item loaded strongly on its designated factor and weakly on the other factors, indicating acceptable convergent and discriminant validity (factor loadings ranged from 0.55 to 0.91). The three factors were self-control (SC, renamed from the learner-control interaction), human-interaction (HI, renamed from the learner-instructor/learner-learner interactions), and information-access (IA, renamed from learner-content interaction).

With regard to the intercorrelation of statements, Cronbach’s alpha was used to estimate the internal consistency of the LPoIS-RCR statements. The 15-statement scale had excellent internal consistency reliability with a Cronbach’s alpha of 0.91. As shown in Table 3, the reliability of each factor was as follows: self-control (SC), 0.88; human-interaction (HI), 0.90; and information-access (IA), 0.89. The following thresholds were used for Cronbach’s alpha coefficient, as suggested by George and Mallery (Citation2010): alpha > 0.9 (Excellent) and alpha > 0.8 (Good). In short, the EFA and Cronbach’s alpha coefficients indicated that the constructed LPoIS-RCR had acceptable validity and internal consistency. STATA 16.0 was used on sample 1 for the validity analysis. Table 3 presents the mean, standard deviation, subscale reliability, and factor loading of each item on the LPoIS-RCR.
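Cronbach’s alpha itself is a short formula; this minimal sketch (on toy data, not the study’s responses) mirrors the internal-consistency computation described above.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k / (k - 1) * (1 - sum(item variances) / variance(total score))
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy data: three items driven by one shared trait -> high alpha.
rng = np.random.default_rng(42)
trait = rng.normal(size=(100, 1))
responses = trait + rng.normal(scale=0.3, size=(100, 3))
alpha = cronbach_alpha(responses)
```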

Table 3. Factor loadings of the 15 statements across LPoIS-RCR’s three factors by sample 1.

4.2. Study 2: Verification of the LPoIS-RCR and students’ perceived interactivity in RCR course

4.2.1. Confirmatory factor analysis (CFA)

CFA was used for sample 2 (N = 390) to determine whether the fit of the model data (between the item-factor structures) was consistent with the results of the EFA. Since different indices provide different estimates of how well the data fit a priori hypothesized model, the authors decided to assess the goodness-of-fit by using the chi-square test, RMSEA, SRMR, and CFI (Browne and Cudeck Citation1992; MacCallum, Browne, and Sugawara Citation1996). In addition, STATA 16.0 was used on sample 2 for the validity analysis.

The chi-square test statistic was significant (χ2 = 580.692, df = 87, p < .001), partly because of the large sample size and number of items. Consequently, the authors relied on other indices to support the model’s validity (CFI = 0.88, RMSEA = 0.12, SRMR = 0.09). Although the goodness-of-fit indices for the hypothesized model were not sufficient, the factor structural model of the LPoIS-RCR was kept. Figure 3 presents the factor structural model of the LPoIS-RCR. SC1–SC5 represent the items from the self-control (SC) factor. HI1–HI5 represent the items from the human-interaction (HI) factor. IA1–IA5 represent the items from the information-access (IA) factor.

Figure 3. Three-factor structural model of the LPoIS-RCR based on sample 2.


As shown in Table 4, the component reliability (CR) values for the three factors were 0.88, 0.90, and 0.90, indicating adequate internal consistency of the measurement model (Bagozzi and Yi Citation1988). Additionally, the average variance extracted (AVE) for each factor was measured at 58.9%, 63.8%, and 64.1%, indicating good convergent validity of the scales (Bagozzi and Yi Citation1988; Hair et al. Citation2010).
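CR and AVE follow directly from the standardized factor loadings; the loadings in this sketch are hypothetical placeholders, not the LPoIS-RCR estimates.

```python
def cr_and_ave(loadings):
    """Component/composite reliability (CR) and average variance
    extracted (AVE) from standardized loadings:
    CR  = (sum L)^2 / ((sum L)^2 + sum(1 - L^2))
    AVE = mean(L^2)
    """
    s = sum(loadings)
    error = sum(1 - l ** 2 for l in loadings)
    cr = s ** 2 / (s ** 2 + error)
    ave = sum(l ** 2 for l in loadings) / len(loadings)
    return cr, ave

# A hypothetical five-item factor with moderately strong loadings.
cr, ave = cr_and_ave([0.75, 0.80, 0.78, 0.82, 0.70])
```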

Table 4. AVE and CR of CFA of the LPoIS-RCR by sample 2.

4.2.2. Students’ perceived interactivity of the RCR course across learning formats

Table 5 shows descriptive statistics for each statement of the LPoIS-RCR by sample 2. As Table 5 indicates, the items with the largest absolute mean scores in the SC, HI, and IA factors were “Decide where and when to learn” (M = −3.15, SD = 2.74), “Gain more opportunities to interact directly with the instructor” (M = 2.17, SD = 3.09), and “Acquire learning materials in animation format” (M = −2.09, SD = 2.89), respectively.

Table 5. Descriptive statistics of the 15 statements of the LPoIS-RCR across three factors.

Figure 4 shows a bar graph of the average score of each factor in the LPoIS-RCR; the graph indicates that sample 2 students perceived a higher degree of interactivity in the SC (M = −2.25, SD = 2.43) and IA (M = −1.46, SD = 2.62) factors in an online course than in a face-to-face course. In contrast, students perceived a higher degree of interactivity in the HI factor (M = 1.27, SD = 2.70) in a face-to-face course than in an online course.

Figure 4. Bar charts for the average score of each factor in LPoIS-RCR (N = 390).


4.2.3. Students’ preference to take the online course or face-to-face course

As shown in Table 6, the authors asked the sample 2 students which format of the RCR course they would prefer. The results indicated that 317 (81.3%) students still wanted to take an online RCR course, and 73 (18.7%) students preferred a face-to-face RCR course. In addition, if students had a chance to offer a recommendation to their schoolmates, 312 (80.0%) students would have recommended an online RCR course, and 78 (20.0%) students would have recommended a face-to-face RCR course.

Table 6. Results for students’ decisions on RCR course format.

4.2.4. Comparison of groups with different RCR course format preferences and recommendations

Shapiro-Wilk tests were conducted to check the normality of the factor scores of the groups with different RCR course format preferences. Since the data were not normally distributed, this research used the non-parametric Mann-Whitney U test, with significance accepted at p < 0.05 in a two-tailed test (Siegel Citation1956). The results revealed statistically significant differences between the online group and the face-to-face group in self-control (U = 5560.50, p < .001), human-interaction (U = 7545.50, p < .001), and information-access (U = 5687.50, p < .001).
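This two-step procedure (normality check, then non-parametric fallback) can be sketched with SciPy. The group sizes match the preference groups reported above, but the scores here are simulated placeholders, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical factor scores on a bounded bipolar scale (not the study's data)
online_group = rng.integers(-5, 6, size=317).astype(float)
face_to_face_group = rng.integers(-5, 6, size=73).astype(float)

# Step 1: Shapiro-Wilk normality test for each preference group
for name, scores in (("online", online_group), ("face-to-face", face_to_face_group)):
    w, p = stats.shapiro(scores)
    print(f"{name}: W = {w:.3f}, p = {p:.4f}")

# Step 2: if normality is rejected, compare groups with the non-parametric
# Mann-Whitney U test (two-tailed, alpha = .05)
u, p = stats.mannwhitneyu(online_group, face_to_face_group, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.4f}")
```

The same pipeline applies unchanged to each of the three factor scores and to the recommendation groups analyzed below.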

As shown in Table 7, the students who preferred the online format perceived more interactivity with respect to self-control and information-access than students who preferred the face-to-face format. In contrast, students who preferred the face-to-face format perceived more interactivity with respect to human-interaction than students who preferred the online format.

Table 7. Descriptive statistics and Mann–Whitney’s U-tests of three LPoIS-RCR factors based on RCR course format preferences.

Mann-Whitney U tests were likewise used to compare the perceived interactivity of those who recommended the online format with that of those who recommended the face-to-face format. As shown in Table 8, statistically significant differences emerged between the two groups in self-control (U = 6057.00, p < .001), human-interaction (U = 7947.50, p < .001), and information-access (U = 6224.50, p < .001). The students who recommended the online format perceived more interactivity with respect to self-control and information-access than students who recommended the face-to-face format. In contrast, students who recommended the face-to-face format perceived more interactivity with respect to human-interaction than students who recommended the online format.

Table 8. Descriptive statistics and Mann–Whitney’s U-tests of three LPoIS-RCR factors based on the recommended course format.

4.2.5. Prediction of course format by factors of student-perceived interactivity

A logistic regression analysis was also conducted to determine which factors of perceived interactivity could predict students’ future choice of an RCR course format and their recommendations to schoolmates. The results are shown in Table 9. The logistic regression coefficient (B), standard error (S.E.), Wald test, and odds ratio for each predictor indicated that the model specification provided a reasonably good fit to the data and was acceptable for predicting graduate students’ preference for an RCR course format (Omnibus χ2 = 61.531, p < .001, Nagelkerke’s R2 = .221) and for predicting their recommendations to others (Omnibus χ2 = 68.242, p < .001, Nagelkerke’s R2 = .236). Both logit models correctly classified 82.2% of the sample observations.
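The reported fit statistics can all be derived from the fitted and intercept-only log-likelihoods of a binary logit model. The numpy-only sketch below fits a logit by Newton-Raphson on simulated data (the predictors, sample size, and coefficients are hypothetical stand-ins for the three interactivity factors, not the study's data) and computes the omnibus χ², Nagelkerke's R², and the correct classification rate:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 390
# Columns: intercept, plus simulated stand-ins for SC, HI, IA factor scores
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
true_b = np.array([1.4, 0.8, 0.5, 0.6])          # hypothetical coefficients
y = (rng.random(n) < 1 / (1 + np.exp(-X @ true_b))).astype(float)

def fit_logit(X, y, iters=25):
    """Maximum-likelihood logit fit via Newton-Raphson updates."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ b))
        W = p * (1 - p)                           # weights for the Hessian
        b += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
    return b

def log_lik(X, y, b):
    p = 1 / (1 + np.exp(-X @ b))
    return float(np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)))

b = fit_logit(X, y)
ll_full = log_lik(X, y, b)
ll_null = log_lik(X[:, :1], y, fit_logit(X[:, :1], y))   # intercept-only model

omnibus_chi2 = 2 * (ll_full - ll_null)
cox_snell = 1 - np.exp((2 / n) * (ll_null - ll_full))
nagelkerke = cox_snell / (1 - np.exp((2 / n) * ll_null))  # rescaled to max 1
accuracy = float(((1 / (1 + np.exp(-X @ b)) >= 0.5) == y).mean())
print(f"chi2 = {omnibus_chi2:.1f}, Nagelkerke R^2 = {nagelkerke:.3f}, accuracy = {accuracy:.1%}")
```

Nagelkerke's R² rescales the Cox-Snell index by its theoretical maximum so that, unlike Cox-Snell, it can reach 1 for a perfectly fitting model.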

Table 9. Logit model estimates of factors determining students’ choice of and recommendation of an RCR course format.

Additionally, as shown in Table 9, the coefficients of self-control, human-interaction, and information-access are positive and significant, indicating that all three factors significantly predict both students’ future choice of course format and their recommendations to other schoolmates.

5. Discussion

This research not only identified types of interactivity embedded in an RCR course but also examined graduate students’ perceived interactivity in different formats (fully online and face-to-face) of the course. A total of 581 Taiwanese graduate students, divided into two distinct samples, completed the self-developed LPoIS-RCR. Sample 1 (191 students), used in Study 1, completed a 6-hour online RCR course. Sample 2 (390 students), the participant group of Study 2, finished both the online course and a 4-hour face-to-face session.

Study 1 investigated research question one: What types of interactivity can be embedded in RCR courses to serve different needs for interaction? Drawing on the literature, the authors’ teaching experience, and an examination of the technology affordances of online courseware, 20 possible interactive features were listed. After the EFA and CFA on two separate samples, the 15 items of the LPoIS-RCR loaded onto three factors: self-control, human-interaction, and information-access. The results highlighted the significance not only of human interaction, which typically occurs in face-to-face classrooms, but also of students’ self-regulation of their learning progress and their need for access to related online learning materials as important components of interactivity. As Bannan-Ritland (Citation2002) noted, past reviews of interactivity predominantly emphasized student-to-student or instructor-to-student interaction and usually ignored other possible types of interaction. This research likewise acknowledges that, beyond learner-learner and learner-instructor interactions, instructional methods built on rapidly developing technology have fostered a wider range of interaction types. Therefore, this study introduced additional forms of interaction to further understand learners’ perceptions of these varied interaction types and interactivity levels, and it provides empirical evidence that students can perceive other types of interaction, such as control and access. In line with the belief that higher levels of interactivity enhance learning (e.g., Wei, Peng, and Chou Citation2015), it is therefore imperative for course developers to incorporate appropriate and sufficient interactive features when designing both online and face-to-face RCR courses. In particular, interactive features of learner-control, human interaction, and information access should be thoughtfully built into the course design.

Study 2 addresses research questions two, three, and four. The second research question asks how a consistent group of students perceives the embedded interactivity in two different formats of their RCR course. Findings revealed that a fully online course not only provides a more flexible and convenient learning environment, allowing control over learning time and exam retakes, but also facilitates self-directed learning (i.e., monitoring their learning process and pace and controlling the learning topics) and enables students to access diverse RCR-related resources (i.e., related websites, assignments, real cases, and multimedia materials). This finding supports Katsarov et al.’s (Citation2022) conclusion that knowledge acquisition appears to benefit more from an online individual learning environment. In contrast, as expected, students perceived more interaction with instructors, peers, and course developers in the face-to-face course. They particularly valued the chances for Q&A sessions and opportunities for feedback on the learning content. These outcomes echo the study by Todd et al. (Citation2017), affirming the advantages of online courses in terms of flexibility and versatility while emphasizing the significance of meaningful human interactions and social activities in face-to-face settings. The present study provides empirical evidence from the student perspective, enriching the existing body of research.

The third and fourth research questions address the factors of interactivity influencing students’ choice and recommendation of an online or face-to-face RCR course. The current study revealed that all three factors (self-control, human-interaction, and information-access) contribute to students’ selection and recommendation of the RCR course format. In particular, over 80% of students would prefer a fully online course, with a similar proportion recommending this format to others. Those favoring the online format emphasized the importance of self-control and information access, while proponents of the face-to-face format emphasized the significance of human interaction. A possible reason for these findings is that the participants, predominantly graduate students fulfilling RCR requirements for current research, prioritize flexibility and control in their professional coursework. Moreover, quick and easy access to a variety of RCR-related learning resources further accentuates the appeal of online courses. Nevertheless, approximately 20% of students preferred the face-to-face course format, either for themselves or for others, because of its emphasis on human interactions.

6. Conclusion and implications

Although the statistical results indicate a preference for online RCR courses, the primary aim of this research is not solely to highlight this preference but rather to examine deficiencies in interactivity within this format. The implications of our findings are threefold.

First, in Study 2, graduate students comparing the online and face-to-face formats noted the absence of human interaction in the online setting. Addressing this missing component is critical, especially within Taiwan’s current RCR education, which predominantly employs the online format. Strategies to facilitate engagement may include instructors regularly sending announcements or providing timely and individualized feedback to students (Bolliger and Martin Citation2018; Daily-Hebert Citation2018). Additionally, as online learning has become more prevalent in higher education, instructors need to better understand the engagement and gains of students who only have access to online courses (Dumford and Miller Citation2018). This is particularly true during events like the COVID-19 pandemic, when most face-to-face or blended/flipped courses were unavailable or canceled. The results of this research can provide some insights from the instructional design perspective.

Second, researchers have suggested that RCR education should integrate discussion and reasoning activities (e.g., Mumford et al. Citation2008; Watts et al. Citation2017; Zigmond and Fischer Citation2014). Designers of online RCR courses should leverage interactive features (synchronous/asynchronous human interaction, immediate feedback, multimedia examples, or even a robot) to facilitate high-quality activities and support systems for students and instructors (Domagk, Schwartz, and Plass Citation2010).

Lastly, this research calls attention to the lack of interactive elements in the face-to-face format. Notably, the authors found that graduate students perceived more self-control and information access in the online format, highlighting the potential benefits of adopting a student-centered approach that gives students more control and flexibility over when, where, and what to learn. In addition, enriching face-to-face interactions with accessible resources and multimedia materials could significantly enhance interactivity. Considering these insights, a blended format may emerge as an optimal solution, offering flexibility, choice, and adaptability, provided that online resources are available and teaching faculty are sufficient (Kay, MacDonald, and DiGiuseppe Citation2019; Todd et al. Citation2017).

7. Limitations and future research

This research has limitations that point toward future research directions. First, it focused solely on the aspect of interactivity in the RCR course from the viewpoint of graduate students. Although students seemed to recognize the interactive features of the online format in this research, a more profound understanding of whether students effectively use or leverage these features could be pivotal in refining instructional strategies for enhanced learning outcomes. In particular, this research did not delve into a vital aspect of learning outcomes, namely effectiveness (Todd et al. Citation2017). While the authors believe a higher degree of interactivity may facilitate learning, it is unclear whether more interactive features embedded in online RCR courses truly promote effective learning. Future research should address more aspects of the learning process and its outcomes, such as effectiveness, awareness, knowledge acquisition, and practice of and attitudes toward research integrity and RCR, as demonstrated by Francesca et al. (Citation2023) in their study of early-career researchers. Doing so would yield a more comprehensive understanding of online RCR course dynamics.

Second, the current research only compared interaction types and students’ evaluations of interactivity levels across two course formats: a text/visual-based online course and a teacher-based face-to-face lecture. Future studies could therefore benefit from incorporating a broader range of instructional formats. Possible formats include the Online-Merge-Offline (OMO) model (Huang et al. Citation2021), which integrates Open Educational Practices with real-time learning spaces; Massive Open Online Courses (MOOCs), which encompass lecture videos, quizzes, interactive web assignments, and discussion forums; and the OpenCourseWare (OCW) approach, in which teachers’ lectures are recorded and made publicly available online. By comparing these formats, future research may explore variations in students’ perceptions of the interaction styles and the level of interactivity each format provides. This exploration would shed light on how diverse instructional formats and strategies affect learners’ engagement and educational outcomes.

Third, this study primarily focused on developing an instrument (i.e., the LPoIS-RCR) to measure students’ perceptions and preferences regarding interaction types and interactivity levels. Two samples were used not only for validation and verification of the instrument but also to provide empirical findings on students’ perceptions and preferences. However, this study did not adopt a rigorous experimental approach, such as including a control group and a pre-post test/questionnaire design. It is suggested that future studies incorporate experimental methods to enhance control and establish cause-effect relationships.

Lastly, this study is limited by variations in student characteristics, including learning behaviors, personal characteristics, and learning preferences, which differ significantly across universities and graduate programs. Accordingly, it is recommended that future research should aim to expand the participant and instructor pool across different institutions to account for this variability. Additionally, subsequent studies could explore the potential impact of instructors’ attributes, teaching practices, and instructional methods on students’ perceptions of interactivity, their preferences, and, ultimately, their educational outcomes.

Research involving human participants and/or animals

This research involved human participants. Ethical approval was obtained from the Research Ethics Committee for Human Subject Protection at National Chiao Tung University, Taiwan (Application No. NCTU-REC-106-002-m).

Informed consent

Informed consent was obtained from each student participant before participation in the research.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the National Science and Technology Council, Taiwan, under Grants MOST105-2511-S-009-008-MY3 and MOST110-2511-H-A49-008-MY4.

References

  • Abdi, S., D. Pizzolato, B. Nemery, and K. Dierickx. 2021. “Educating PhD Students in Research Integrity in Europe.” Science and Engineering Ethics 27 (5). https://doi.org/10.1007/s11948-021-00290-0.
  • Antes, A. L. 2014. “A Systematic Approach to Instruction in Research Ethics.” Accountability in Research 21 (1): 50–67. https://doi.org/10.1080/08989621.2013.822269.
  • Antes, A. L., S. T. Murphy, E. P. Waples, M. D. Mumford, R. P. Brown, S. Connelly, and L. D. Devenport. 2009. “A Meta-Analysis of Ethics Instruction Effectiveness in the Sciences.” Ethics & Behavior 19 (5): 379–402. https://doi.org/10.1080/10508420903035380.
  • Bagozzi, R. P., and Y. Yi. 1988. “On the Evaluation of Structural Equation Models.” Journal of the Academy of Marketing Science 16 (1): 74–94. https://doi.org/10.1007/bf02723327.
  • Bannan-Ritland, B. 2002. “Computer-Mediated Communication, Elearning, and Interactivity: A Review of the Research.” The Quarterly Review of Distance Education 3 (2): 161–179. https://www.learntechlib.org/p/95271/.
  • Barak, M., and G. Green. 2020. “Novice Researchers’ Views about Online Ethics Education and the Instructional Design Components that may Foster Ethical Practice.” Science and Engineering Ethics 26 (3): 1403–1421. https://doi.org/10.1007/s11948-019-00169-1.
  • Bolliger, D. U., and F. Martin. 2018. “Instructor and Student Perceptions of Online Student Engagement Strategies.” Distance Education 39 (4): 568–583. https://doi.org/10.1080/01587919.2018.1520041.
  • Braunschweiger, P., and K. W. Goodman. 2007. “The CITI Program: An International Online Resource for Education in Human Subjects Protection and the Responsible Conduct of Research.” Academic Medicine 82 (9): 861–864. https://doi.org/10.1097/ACM.0b013e31812f7770.
  • Brockman, R. M., J. M. Taylor, L. W. Segars, V. Selke, and T. A. H. Taylor. 2020. “Student Perceptions of Online and In-Person Microbiology Laboratory Experiences in Undergraduate Medical Education.” Medical Education Online 25 (1): 1710324. https://doi.org/10.1080/10872981.2019.1710324.
  • Browne, M. W., and R. Cudeck. 1992. “Alternative Ways of Assessing Model Fit.” Sociological Methods & Research 21 (2): 230–258. https://doi.org/10.1177/0049124192021002005.
  • Center for Taiwan Academic Research Ethics Education. 2024. Conflict of Interest in Research [Online Learning Material]. Accessed April 29, 2024. https://ethics.moe.edu.tw.
  • Chou, C. 2003. “Interactivity and Interactive Functions in Web-Based Learning Systems: A Technical Framework for Designers.” British Journal of Educational Technology 34 (3): 265–279. https://doi.org/10.1111/1467-8535.00326.
  • Chou, C., and Y.-H. Lee. 2022. “The Development of a Literacy-Based Research Integrity Assessment Framework for Graduate Students in Taiwan.” Science and Engineering Ethics 28 (6): 66. https://doi.org/10.1007/s11948-022-00401-5.
  • Chou, C., H. Peng, and C.-Y. Chang. 2010. “The Technical Framework of Interactive Functions for Course-Management Systems: Students’ Perceptions, Uses, and Evaluations.” Computers & Education 55 (3): 1004–1017. https://doi.org/10.1016/j.compedu.2010.04.011.
  • Daily-Hebert, A. 2018. Maximizing Interactivity in Online Learning: Moving Beyond Discussion Boards. (EJ1199230). ERIC. https://files.eric.ed.gov/fulltext/EJ1199230.pdf.
  • Domagk, S., R. N. Schwartz, and J. L. Plass. 2010. “Interactivity in Multimedia Learning: An Integrated Model.” Computers in Human Behavior 26 (5): 1024–1033. https://doi.org/10.1016/j.chb.2010.03.003.
  • DuBois, J. M., D. A. Schilling, E. Heitman, N. H. Steneck, and A. A. Kon. 2010. “Instruction in the Responsible Conduct of Research: An Inventory of Programs and Materials within CTSAs.” Clinical and Translational Science 3 (3): 109‐111. https://doi.org/10.1111/j.1752-8062.2010.00193.x.
  • Dumford, A. D., and A. L. Miller. 2018. “Online Learning in Higher Education: Exploring Advantages and Disadvantages for Engagement.” Journal of Computing in Higher Education 30 (3): 452–465. https://doi.org/10.1007/s12528-018-9179-z.
  • Francesca, G., S. Ceruti, S. Martini, M. Picozzi, M. Cosentino, and F. Marino. 2023. “Educating and Training in Research Integrity (RI): A Study on the Perceptions and Experiences of Early Career Researchers Attending an Institutional RI Course.” Journal of Academic Ethics. https://doi.org/10.1007/s10805-023-09497-1.
  • George, D., and P. Mallery. 2010. SPSS for Windows Step by Step: A Simple Guide and Reference 18.0 Update. 11th ed. Boston: Prentice Hall Press.
  • Hair, J. F., W. C. Black, B. J. Babin, and R. E. Anderson. 2010. Multivariate Data Analysis. 7th ed. New York: Pearson.
  • Harter, S. 1982. “The Perceived Competence Scale for Children.” Child Development 53 (1): 87–97. https://doi.org/10.2307/1129640.
  • Huang, R., A. Tlili, H. Wang, Y. Shi, C. J. Bonk, J. Yang, and D. Burgos. 2021. “Emergence of the Online-Merge-Offline (OMO) Learning Wave in the Post-COVID-19 Era: A Pilot Study.” Sustainability 13 (6): 3512. https://doi.org/10.3390/su13063512.
  • Kalichman, M. 2013. “A Brief History of RCR Education.” Accountability in Research 20 (5–6): 380–394. https://doi.org/10.1080/08989621.2013.822260.
  • Kalichman, M. 2014. “Rescuing Responsible Conduct of Research (RCR) Education.” Accountability in Research 21 (1): 68–83. https://doi.org/10.1080/08989621.2013.822271.
  • Katsarov, J., R. Andorno, A. Krom, and M. van den Hoven. 2022. “Effective Strategies for Research Integrity Training—A Meta-Analysis.” Educational Psychology Review 34 (2): 935–955. https://doi.org/10.1007/s10648-021-09630-9.
  • Kay, R., T. MacDonald, and M. DiGiuseppe. 2019. “A Comparison of Lecture-Based, Active, and Flipped Classroom Teaching Approaches in Higher Education.” Journal of Computing in Higher Education 31 (3): 449–471. https://doi.org/10.1007/s12528-018-9197-x.
  • Kennedy, G. E. 2004. “Promoting Cognition in Multimedia Interactivity Research.” Journal of Interactive Learning Research 15 (1): 43–61. https://www.learntechlib.org/primary/p/4530/.
  • Lazarevic, B., and D. Bentz. 2021. “Student Perception of Stress in Online and Face-To-Face Learning: The Exploration of Stress Determinants.” American Journal of Distance Education 35 (1): 2–15. https://doi.org/10.1080/08923647.2020.1748491.
  • Lustria, M. L. A. 2007. “Can Interactivity Make a Difference? Effects of Interactivity on the Comprehension of and Attitudes Toward Online Health Content.” Journal of the American Society for Information Science and Technology 58 (6): 766–776. https://doi.org/10.1002/asi.20557.
  • MacCallum, R. C., M. W. Browne, and H. M. Sugawara. 1996. “Power Analysis and Determination of Sample Size for Covariance Structure Modeling.” Psychological Methods 1 (2): 130–149. https://doi.org/10.1037/1082-989X.1.2.130.
  • Mulhearn, T. J., L. M. Steele, L. L. Watts, K. E. Medeiros, M. D. Mumford, and S. Connelly. 2017. “Review of Instructional Approaches in Ethics Education.” Science and Engineering Ethics 23 (3): 883–912. https://doi.org/10.1007/s11948-016-9803-0.
  • Mumford, M. D., S. Connelly, R. P. Brown, S. T. Murphy, J. H. Hill, A. L. Antes, and L. D. Devenport. 2008. “A Sensemaking Approach to Ethics Training for Scientists: Preliminary Evidence of Training Effectiveness.” Ethics & Behavior 18 (4): 315–339. https://doi.org/10.1080/10508420802487815.
  • National Institutes of Health. 2021. “11.2.3.5 Responsible Conduct of Research.” https://grants.nih.gov/grants/policy/nihgps/html5/section_11/11.2.3_application_requirements_and_receipt_dates.htm.
  • Pan, S. J.-A., and C. Chou. 2015. “Using a Two-Tier Test to Examine Taiwanese Graduate students’ Misunderstanding of Responsible Conduct of Research.” Ethics & Behavior 25 (6): 500–527. https://doi.org/10.1080/10508422.2014.987921.
  • Reeves, T. C. 2012. “Interactive Learning.” In Encyclopedia of the Sciences of Learning, edited by N. M. Seel, 1602–1604. Springer. https://doi.org/10.1007/978-1-4419-1428-6_330.
  • Renkl, A., and R. K. Atkinson. 2007. “Interactive Learning Environments: Contemporary Issues and Trends. An Introduction to the Special Issue.” Educational Psychology Review 19 (3): 235–238. https://doi.org/10.1007/s10648-007-9052-5.
  • Siegel, S. 1956. Nonparametric Statistics for the Behavioural Sciences. New York: McGraw-Hill.
  • Steele, L. M., T. J. Mulhearn, K. E. Medeiros, L. L. Watts, S. Connelly, and M. D. Mumford. 2016. “How do We Know What Works? A Review and Critique of Current Practices in Ethics Training Evaluation.” Accountability in Research 23 (6): 319–350. https://doi.org/10.1080/08989621.2016.1186547.
  • Steneck, N. H. 2006. “Fostering Integrity in Research: Definitions, Current Knowledge, and Future Directions.” Science and Engineering Ethics 12 (1): 53–74. https://doi.org/10.1007/s11948-006-0006-y.
  • Todd, E. M., L. L. Watts, T. J. Mulhearn, B. S. Torrence, M. R. Turner, S. Connelly, and M. D. Mumford. 2017. “A Meta-Analytic Comparison of Face-to-Face and Online Delivery in Ethics Instruction: The Case for a Hybrid Approach.” Science and Engineering Ethics 23 (6): 1719–1754. https://doi.org/10.1007/s11948-017-9869-3.
  • Tratnik, A., M. Urh, and E. Jereb. 2019. “Student Satisfaction with an Online and a Face-to-Face Business English Course in a Higher Education Context.” Innovations in Education and Teaching International 56 (1): 36–45. https://doi.org/10.1080/14703297.2017.1374875.
  • Watts, L. L., K. E. Medeiros, T. J. Mulhearn, L. M. Steele, S. Connelly, and M. D. Mumford. 2017. “Are Ethics Training Programs Improving? A Meta-Analytic Review of Past and Present Ethics Instruction in the Sciences.” Ethics & Behavior 27 (5): 351–384. https://doi.org/10.1080/10508422.2016.1182025.
  • Wei, H.-C., H. Peng, and C. Chou. 2015. “Can More Interactivity Improve Learning Achievement in an Online Course? Effects of College students’ Perception and Actual Use of a Course-Management System on Their Learning Achievement.” Computers & Education 83:10–21. https://doi.org/10.1016/j.compedu.2014.12.013.
  • Zigmond, M. J., and B. A. Fischer. 2014. “Teaching Responsible Conduct Responsibly.” Journal of Microbiology & Biology Education 15 (2): 83–87. https://doi.org/10.1128/jmbe.v15i2.874.