
Application of gamified virtual laboratories as a preparation tool for civil engineering students

Pages 164-191 | Received 10 May 2022, Accepted 26 Sep 2023, Published online: 29 Sep 2023

ABSTRACT

Practical laboratory sessions are essential for engineering education, demanding efficient use of limited time. In recent years, Virtual Reality (VR) technologies have introduced Virtual Laboratories (VLs), offering the potential to enhance students’ educational experience. Despite their potential, VLs are rarely utilised in civil engineering education. This research investigates the effectiveness of a gamified VL designed to simulate a concrete laboratory, aimed at better preparing students for experiments. A quasi-experimental study divided 92 students into control and experimental groups using cluster sampling. The control group received traditional lab training, while the experimental group engaged with the VL training environment. The results demonstrate that students using the VL spent significantly less time in the physical lab, exhibited improved competence in navigating lab setups, posed fewer questions about experimental procedures, and required less assistance from lab assistants. Notably, VL users spent 16% less time in the physical lab and needed fewer interventions from lab assistants. This study highlights the potential of VLs as potent tools for preparing engineering students for traditional lab sessions. Post-experiment surveys revealed a strong willingness among students in the experimental group to use VLs in future similar lab sessions, emphasising the positive impact of integrating VLs into engineering education.

1. Introduction

Given the very applied nature of engineering disciplines, practical lab sessions are an integral part of engineering education worldwide (Alkhaldi, Pranata, and Athauda 2016; Allen and Barker 2020; Hensen and Barbera 2019; Krontiris 2021; Stuchlikova et al. 2017). In civil engineering, too, practical lab sessions are deemed essential to familiarise students with the practical use of theories and ultimately prepare them for the job market (Guerrero-Mosquera et al. 2018). However, lab sessions inherently demand a high degree of engagement, alertness, and preparation from the students, mainly due to health and safety concerns (Zhang et al. 2020). Additionally, lab sessions are logistically difficult and expensive to run, and they are usually restrictive in terms of both time and space (Abdulwahed and Nagy 2014; Achuthan and Murali 2015). Therefore, it is important that students make the best use of their limited time in the labs. Pre-lab preparation has a significant impact on ensuring that lab time is used efficiently. However, conventional lab preparation strategies, which are commonly based on the use of instruction manuals, can be ineffective (Abdulwahed and Nagy 2014; McAfee, Armstrong, and Cunningham 2009). The development of Virtual Reality (VR) technologies in recent years has offered new possibilities to address this limitation through the use of more engaging lab preparation methods. To this end, the present study aims to develop a gamified Virtual Laboratory (VL) training tool that can be used to help prepare students to execute a lab experiment more effectively and efficiently.

1.1. Traditional and virtual lab sessions

Conventionally, engineering lab sessions include at least three phases: (1) briefing/preparation, (2) execution, and (3) analysis and reporting (Karakasidis 2013). During the preparation phase, students are familiarised with the experiment (in terms of required instruments and procedure) through lectures by the teachers, pre-recorded instructional videos, or written guidelines. During the execution, students, individually or in groups, go to the lab and either carry out the experiment themselves under supervision or observe the experiment carried out by a technician. Finally, students collect the experimental data, analyse them, and prepare a report for submission and subsequent assessment. The smooth execution of the engineering lab depends, to a great extent, on good preparation by both instructors and students. The more familiar students are with the lab process and safety rules, and the better these are communicated to them beforehand, the better they can use their time in the lab (Abdulwahed and Nagy 2014). Therefore, this study focuses on how the briefing/preparation phase influences performance during the execution phase.

The transition to online education during COVID-19 unmasked some of the limitations of conventional practices in the delivery of engineering lab sessions. While theoretical lectures could be delivered online with relative ease, the lab sessions proved very difficult, if not impossible, to retain in the engineering curriculum (Gamage et al. 2020). This is mainly due to the high dependency of lab sessions on physical instruments and spaces, and the absence of alternative technological media that offer similar, or moderately comparable, opportunities for interacting with the physical objects, equipment, tools, and lab procedures (Gamage et al. 2020). This is a particular impediment to the acquisition of psychomotor skills (Chiew, Bidaun, and Sipi 2021).

Several alternative approaches have been developed over the past two decades to surmount the limitations of physical lab sessions. Generally, two trends can be identified: (1) remote labs and (2) VLs (Krontiris 2021). The remote lab refers to a setting where students are given remote access to the actual physical lab so that they can either observe or lead the experiments remotely while an agent, be it a human or a computerised agent, executes the experiment (Alkhaldi, Pranata, and Athauda 2016; Potkonjak et al. 2016). The VL, on the other hand, pertains to cases where VR technologies are used to create a virtual representation of the lab environment. In this case, students are placed or immersed in the virtual world and perform the experiment using simulated processes (Bhute et al. 2021).

1.2. Gamified virtual laboratory as a preparation tool

1.2.1. Application of virtual laboratories

In recent years, both remote and virtual labs have been extensively investigated in domains such as chemistry, biology, medicine, and engineering (Bennie et al. 2019; Darrah et al. 2014; Doak et al. 2020; Grodotzki, Ortelt, and Tekkaya 2018; Li et al. 2017; Maloney et al. 2012; Sharma 2016; Tan et al. 2020). Although results are promising, the actual use of VLs remains a rarity in the engineering domain. This is mainly because, in the majority of cases, remote/virtual labs are developed and proposed for application in a substitutional capacity, i.e. to replace the actual physical sessions (e.g. Achuthan and Murali 2015), and this poses a number of limitations including, but not limited to, dependency on the availability of high computing power, the inability of remote/virtual labs to invoke the same degree of engagement in students, failure to teach the details of operating actual equipment, the complexity of developing a realistic and interactive VL, restriction to limited predefined scenarios, high development cost, and low scalability and extensibility (Aliane, Pastor, and Mariscal 2010; Krontiris 2021; Potkonjak et al. 2016).

1.2.2. Different modes of VL intervention

In view of the above-mentioned limitations, Krontiris (2021) suggested that VR labs are more suitable for simple experiments and entry-level courses because physical sessions are still necessary for complex experiments. Moreover, Ma et al. (2021) found that while students appreciated the concept of a VR lab, the majority of them still preferred physical lab sessions because of the need to actually experiment with physical objects and the desire to learn through collaboration and teamwork (Bhute et al. 2021). Similar observations were made by Schnieder, Williams, and Ghosh (2022), who reported the preference of students to use VL in conjunction with physical lab sessions. It is argued that the necessity of acquiring psychomotor skills in lab-based education inevitably requires a certain degree of exposure to actual physical labs (Alkhaldi, Pranata, and Athauda 2016; Allen and Barker 2020; Marques et al. 2014). As a complementary tool, however, VL sessions have been demonstrated to benefit the conceptual understanding of secondary vocational engineering students (Kollöffel and de Jong 2013). In the more specific context of civil and mechanical engineering, Vergara et al. (2021) observed that out of 410 surveyed students, only 11% considered VLs self-sufficient; the remaining 89% believed VLs need to be complemented by physical labs. In this hybrid model, VLs are used in a complementary rather than substitutional capacity. For instance, it has been proposed that VR can be used before lab sessions as a briefing/preparatory tool (Krontiris 2021; Stuchlikova et al. 2017). VR as a preparatory tool has been shown to be effective in better familiarising students with the experimental procedure (Ullah, Ali, and Rahman 2016b), boosting students' self-confidence for engaging in lab sessions and reducing perceived time spent in the physical lab (Blackburn, Villa-Marcos, and Williams 2018), and increasing students' sense of feeling prepared for their physical lab sessions compared to using a briefing worksheet (Hatchard et al. 2019). Moreover, preparation procedures that include a VL component have been demonstrated to be more beneficial to learning outcomes than those that do not (Abdulwahed and Nagy 2014). Thus, using VLs has demonstrable advantages for students' experiences and learning. However, these training tools were presented more as instructional tools and lacked interactive and gamified components. This can be an oversight considering the significant impact of gamification on improving the reception of technology-enhanced education among students (Rivera and Palmer Garden 2021).

1.2.3. Impact of gamification on engineering laboratory

Gamification is the insertion of game-like elements into a non-gaming environment, such as point systems, tasks or quests to complete, or showing progress through the environment (Alptekin and Temmen 2020). The interactivity of a training tool, which is an aspect of gamification, fosters the engagement of students with the VL and its learning materials, leading to better reception of the tool (Allen and Barker 2020; Vergara, Rubio, and Lorenzo 2017). An example of this interactivity is the possibility for students to experience trial and error in the acquisition of knowledge, where the environment responds to correct and incorrect actions, fostering experience-based learning as described in constructivist theory (Alrehaili and Al Osman 2019; Barham, Preston, and Werner 2012).

The literature provides strong evidence of how gamification can enhance both students' motivation and performance (Bonde et al. 2014; Kim, Rothrock, and Freivalds 2016; Subhash and Cudney 2018). Researchers have also applied gamification to practical lab sessions and observed similar positive effects on students' performance and engagement (Drace 2013). For instance, Pàmies Vilà et al. (2022) applied a gamification strategy to provide feedback to students in Mechanism and Machine Theory lab sessions. The results indicated that students who were subject to the gamified feedback tool had a higher success rate in the lab exam. In another study, Chen (2020) implemented VL-based training for 3D printing machines both with and without gamification, and demonstrated that the VL was much better received by students when gamification elements were included.

Different gamification elements can be incorporated into a VL. Alptekin and Temmen (2020) enumerated a number of these elements, namely narrative, avatar, quest & mission, points, levels, badges, and ranking lists. A gamified VL does not need to include all of these elements, but at least some should be present. As explained by Alptekin and Temmen (2020), the most significant element is the narrative of the game, which represents the objective of the game and explains how the game starts and ends. The incorporation of a point mechanism is also significant, not only to give students a tangible sense of accomplishment but also to set benchmarks for different skill levels and success/failure thresholds. Evidence regarding the effectiveness of gamification strategies for learning outcomes at various levels of Bloom's taxonomy demonstrates, among other things, that narrative and point mechanisms are appropriate and effective for remembering information and learning to apply learned facts (Bedwell et al. 2012; Rivera and Palmer Garden 2021), which are central to lab preparation.

1.3. Significance of the study

In the specific context of civil engineering, some experiments have been carried out on the development and use of VLs (e.g. Barham, Preston, and Werner 2012; Vergara, Rubio, and Lorenzo 2016; 2019). However, to the best of the authors' knowledge, previous experience with the use of VL in civil engineering was mainly confined to a substitutional rather than a complementary capacity. Therefore, there is limited insight into how VL can be used in a complementary capacity, specifically as a preparation tool for lab sessions. It should be highlighted that when a VL is used in a complementary capacity, its pedagogical objectives transform from mirroring the expected outcomes of the physical lab to preparing students to do the experiments in the lab more effectively (i.e. with fewer ambiguities about the process). This requires the VL performance to be assessed through a different lens. For instance, for a VL to prepare students for the actual lab work, the training tool must help students hone their spatial awareness (i.e. better navigation and orientation in the lab space). This inevitably requires the VL to be a realistic representation of the actual layout and design of the physical lab (Schofield and Lester 2010), a requirement that is less relevant if the VL is used in a substitutional capacity. Additionally, while previous VL platforms in civil engineering incorporated elements of interactivity, i.e. students can interact with the environment and make choices that impact the behaviour of the simulated virtual world (Hatchard et al. 2019), a more elaborate gamification strategy that involves scoring and success/failure mechanisms has seldom been used. Consequently, the main research gap can be formulated as follows: currently, there is little insight into how gamification and VL can be integrated into a preparation tool for engineering lab sessions and how successful the implementation of such a tool can be.

1.4. Research question and objectives

The central question that this pilot study aims to address is to what extent a gamified VL can serve as a preparatory tool for enhancing the effectiveness of physical lab sessions. This translates into two aims: (1) to develop a training environment in the form of a gamified VL as a preparation tool for an engineering lab session, and (2) to investigate, through the implementation of this training environment in an actual lab session, its usability and effectiveness as a training tool. More specifically, the investigation of the implementation assessed:

  1. usability of the VL as rated by the students,

  2. perceived effectiveness of the VL as indexed using students’ ratings of how well-prepared they felt for the lab session,

  3. perceived effectiveness of the VL as indexed using a student assistant’s rating of how well the students were prepared for the lab session, and

  4. effectiveness of the VL as indexed using objective measures of performance.

Both the self-ratings of the students and ratings of student assistants who supervise the lab sessions were included in the measurement of VL effectiveness. This step was taken to ensure that if ratings of the effectiveness of preparations for the lab were governed by a fascination with or reliance on technology, this would become visible through a comparison of ratings. It should be highlighted that the findings of this research need to be interpreted with caution as it is still a pilot project and results are only an early indication of potential impact.

2. Materials and methods

Figure 1 represents the overall framework implemented in this pilot project. As shown in this figure, the framework consisted of three main phases, namely the conceptualisation, development, and evaluation phases. In short, the conceptualisation phase concerned the determination of the objective, narrative, and gamification elements of the VL. In the development phase, the VL training environment was developed. This environment includes a scenario that represents the sequence of activities that needed to be followed in the concrete lab, as will be explained in Section 2.1. In the evaluation phase, the developed environment was implemented in the Construction Materials course offered in the first quarter of the 2020 academic year at the University of Twente, as will be explained in Section 2.3.

Figure 1. Overview of the research methodology.

2.1. Conceptualisation phase

This phase was implemented to define the scope of the VL training and the gamification strategy. First, the objective of the training was determined by the teacher of the subject, with input from student assistants, considering the bottlenecks observed in the running of lab sessions in previous years. It was determined that one of the greatest challenges in previous years had been the unfamiliarity of students with the lab setup, the order of lab tasks, and safety/housekeeping rules. Student assistants also reported that students depended heavily on their help and guidance. Finally, due to the restrictions set by the university during the Covid period, it was essential that students spend as little time as possible in the lab. Therefore, the main learning objectives of the VL were defined as follows: (1) being able to locate the various elements of the lab setup (i.e. spatial awareness), (2) being able to recall the lab procedure and apply the steps in a VL session (i.e. procedural awareness), (3) being able to interpret and apply safety/housekeeping rules, and (4) reducing the dependency of students on student assistants and reducing the lab time by ensuring that students come to the lab prepared.

With these objectives in mind, three main gamification elements were determined, namely narrative, mission, and points. The narrative of the VL was defined as follows: after informing the students about the mechanism of the game, students need to identify the right procedure of the lab through a set of multiple-choice questions (procedural awareness). After the correct determination of each step, the students should be given safety and housekeeping rules that pertain to that specific task (safety/housekeeping rules). When the task requires the use of specific equipment or relocation to a special part of the lab, the user needs to move to the target location (spatial awareness). Navigation to the right location is the condition for the start of the next task in the process. Also, when students are expected to perform specific tests, which are commonly more complex than other tasks, a visual demonstration should be provided to them (reducing dependencies on student assistants). The game finishes when students have gone through all the tasks (including cleaning the lab). The mission of the game is to finish the game with as few mistakes as possible. To this end, it was decided to incorporate a point system where students get one point for each correct answer and the training is considered complete only when students score more than 80%.
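
To make the point mechanism concrete, the sketch below illustrates the completion rule described above: one point per correctly identified step, with the training considered complete only above the 80% threshold. This is a minimal Python illustration; the paper does not publish its implementation, and all names here are assumptions.

    # Illustrative sketch of the scoring rule; names and structure are
    # assumptions, not the authors' implementation.
    PASS_THRESHOLD = 0.80  # training counts as complete only above 80%

    def training_complete(points: int, total_tasks: int) -> bool:
        # One point is awarded per step identified correctly.
        return points / total_tasks > PASS_THRESHOLD

    # Example: 30 of 36 steps correct on the first try -> ~83% -> complete.
    print(training_complete(30, 36))  # True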

2.2. Development phase

Since the development of spatial awareness was one of the main learning objectives of the VL training, the VL needed to be a realistic 3D representation of the lab. Therefore, the first step of the development phase was scanning and measuring the actual concrete lab at the University of Twente. As suggested by the literature, the graphical accuracy and fidelity of the model play an important role in enhancing the effectiveness of VR-based training environments (Kamińska et al. 2019; Potkonjak et al. 2016; Vergara, Rubio, and Lorenzo 2017). Effectiveness in this context is defined as the extent to which the VL training environment contributes to the better achievement of the learning objectives. To achieve this, the researchers performed a visual scan of the site and took measurements of the dimensions of the lab and the locations of fixed furniture. Many photos were taken to be used as visual references during the development of the 3D model. Next, the realistic 3D model of the lab was built using Google Sketchup, and the Trimble 3D library was used to place realistic representations of fixed furniture and lab equipment. This resulted in an on-scale realistic 3D model, as shown in Figure 2. Once the model was developed, it was transferred to the Unity 3D game engine for the development of a navigable scene. In Unity 3D, a navigation feature was first added to the model to allow users to move freely through the VL using the mouse and keyboard. A first-person view was used for navigation and interaction with the model.

Figure 2. 3D model of the concrete lab.

In the next step, the procedure of the lab work was used to create the training scenario. Figure 3 represents the overview of the scenario implemented in this environment. As shown, first a brief instruction about the use of the environment is presented to the user. Then, the user is asked to provide their name, email address, and student ID, as shown in Figure 4(a). Given that the student's performance is communicated to the teacher in real time through an online server, the user must be connected to the internet. In essence, the training environment consists of 36 lab activities that need to be followed in the correct sequence. At each step of the procedure, the user is presented with three options and asked to identify the correct step, as shown in Figure 4(b). If the provided answer is wrong, the user is asked to try again. When the correct answer is provided, the student is asked to navigate, using the keyboard and mouse, to the pertinent location in the lab where this step takes place. It should be noted that students could not go back to a previous step and were therefore advised to take notes of the correct answers. Arrows are used to usher the user to the right location, as shown in Figure 4(c). Once the user reaches the right location, an information box appears that explains why the step is important and what the important safety and housekeeping considerations are, as shown in Figure 4(d). Similar introductory information is provided to the students in the control group in the guideline, as will be explained in Section 2.3.2. For 4 of the activities, given the sheer amount of important information that needs to be conveyed to the user, instructional videos are played instead of textual information. In these videos, a lab technician explains the details of how the steps need to be executed in the lab, as shown in Figure 4(e). For example, the procedure for the compactability testing of concrete is explained through an instructional video.
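
The per-task flow described above amounts to a simple state machine. The sketch below is a hypothetical Python reconstruction (the actual environment was built in Unity 3D); the callback names and task structure are assumptions made for illustration only.

    # Hypothetical reconstruction of the per-task loop: multiple choice with
    # retries, navigation to the task location, then the information box
    # (or an instructional video for the more complex tests).
    def run_task(task, ask, navigate, show_info):
        first_try = True
        while ask(task.question, task.options) != task.correct_option:
            first_try = False        # wrong answer: the user must try again
        navigate(task.location)      # arrows guide the user; the next task
                                     # starts only after arrival
        show_info(task)              # safety/housekeeping box, or a video
                                     # for 4 of the 36 activities
        return first_try

    def run_training(tasks, ask, navigate, show_info):
        first_tries = sum(run_task(t, ask, navigate, show_info) for t in tasks)
        return first_tries / len(tasks)  # score: share correct on first try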

Figure 3. Scenario of the VL Training.

Figure 4. Screenshots of the developed VL training environment. (a) Introduction; (b) Identification of the next task; (c) navigating to the right location in the lab; (d) presentation of the additional information; (e) embedded instructional video.

Once the user finishes all the tasks, the overall score is presented, in terms of the percentage of the correct answers that were achieved on the first try. If the score is above 80%, the training is considered successful, the student is informed that he/she can proceed to the actual lab work, and the score and the report of successful completion of the training are sent to the server, which is only accessible by the teacher. Before the actual lab work, the teacher checks the list and informs the student assistants in the lab if the said student is allowed to do the lab work.

2.3. Evaluation phase

Figure 5 presents the overview of the evaluation phase in terms of the procedure and instruments used. This figure is explained in the following paragraphs.

Figure 5. Overview of the evaluation phase in terms of procedure and instruments.

2.3.1. Participants

In total, 92 students participated in this study. Of this sample, 82 students were enrolled in the civil engineering programme, while the remaining 10 were external students (i.e. from other engineering programmes) who followed this course as a minor. All non-civil engineering students had been placed in the same project groups by the module coordinator, and these groups happened, by random assignment, to end up in the experimental condition.

81% of the participants (i.e. 75 out of 92) mentioned that they had prior experience with lab work (either at university or in high school). Only one student mentioned having prior experience with the concrete lab (i.e. retook this course). The responses of this student, who was in the experimental group, were checked separately, and since they did not appear to be an outlier on any of the scores, the results were retained in the analysis.

2.3.2. Procedure

To evaluate the (perceived) effectiveness of the developed training environment, it was implemented as an integral part of one of the concrete lab sessions of the course Construction Materials. This course is offered to first-year civil engineering students at a technical university in the east of The Netherlands. A quasi-experimental design was adopted because, despite concerns about experimental rigour, implementation in an existing course can be advantageous to the external validity of the conclusions (Gopalan, Rosinger, and Ahn 2020). A nonequivalent group design was employed because the 18 groups were formed independently of this study, even though they were randomly assigned to either the control or the experimental condition. This approach can be characterised as cluster sampling. The students had to do the lab session in groups of 5, 6, or 7. As shown in Figure 5, the groups were split into the experimental (N = 50 students, 9 groups) and control (N = 42 students, 9 groups) conditions. The assignment to conditions was done randomly.
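
As a minimal illustration of this cluster-sampling step, the sketch below randomly assigns the 18 pre-existing project groups, rather than individual students, to the two conditions; the group labels are hypothetical.

    import random

    # Cluster sampling sketch: randomisation happens at the group level.
    groups = [f"group_{i}" for i in range(1, 19)]   # 18 pre-formed groups
    random.shuffle(groups)
    experimental, control = groups[:9], groups[9:]  # 9 groups per condition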

Students in both conditions were assigned to conduct the same lab session. The main difference between the experimental and control groups was the way in which they were asked to prepare for the lab session. While the groups in the experimental condition (henceforth 'experiment groups') were asked to follow the VL training environment, the groups in the control condition (henceforth 'control groups') were only provided with the conventional lab guideline, which explained all the steps and rules in a textual format. The guideline included four main sections, namely introduction, instruments & environments, procedure, and analysis. In the introduction section, the background of each test was explained to present the rationale behind the test (essentially a summary of the theory already given to the students in the course lectures). In the instruments section, all the required equipment and environments were presented to the students, and graphical aids, i.e. pictures of the equipment, were used to help students better identify them in the lab. The procedure section comprises a step-by-step description of the lab work, including what the students need to do, which instruments to use, and how the instruments need to be set up. Finally, in the analysis section, the necessary procedures to analyse the results of the experiment were outlined. Figure 6 presents an excerpt of the guideline. The use of these graphics can be considered an effort to provide an opportunity comparable to the VL training environment to help students gain insights into the lab setup. Given that it was important that this pilot study did not interfere with the learning objectives of the course, i.e. to remain fair to all students, and because the guideline's sections on the analysis of the results could not be gamified in the VL training environment, the guideline was made available to the experiment groups as well.

Figure 6. An excerpt from the guideline.

As explained in Section 2.1, the students in the experimental groups needed to have successfully completed the VL training (final score > 80%) to be allowed into the lab. However, since the student assistants in the lab were required to be blind to who had followed the training environment, a mechanism was set up so that only groups that had a green light from the teacher could proceed to the lab work. While students belonging to the control groups would automatically receive the green light, the scores of the students in the experimental groups were first checked by the teacher, and only if all students of the group had passed the training would they receive the green light. It is important to highlight that students in the experimental groups were evaluated on the basis of their performance in the VL, which was not possible for the control groups. This, however, should not be perceived as a bias in the design of the experiment. As mentioned before, mission and points are two integral elements of gamification (Alptekin and Temmen 2020); in this experiment, it would have been difficult to claim gamification without them. Because we are assessing the impact of the integration of (1) gamification and (2) VL on students' performance, the contribution of the mission and point elements should be kept as a unique component of the experimental condition. It is also important to consider that it is through gamification and VL that pre-lab assessments could be performed easily; in conventional lab training practice, it is uncommon and impractical to incorporate quality-control checks before the lab. So, in the interpretation of the results, the differences in the performance of students in the two groups should be interpreted as a combined effect of gamification (points and mission) and engagement in the VL.
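
The gating logic can be summarised as in the sketch below; this is an assumed illustration, not the authors' actual tooling. Control groups pass automatically, while an experimental group is cleared only when every member has passed the VL training.

    # Assumed sketch of the green-light mechanism.
    def green_light(member_scores, condition, threshold=0.80):
        if condition == "control":
            return True  # control groups automatically get the green light
        # Experimental groups: every member must have passed the VL training.
        return all(score > threshold for score in member_scores)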

Evaluations of the usability of the VL and the effectiveness of preparations were done directly after the lab session by both students and student assistants.

2.3.3. Instrumentation

2.3.3.1. Subjective assessment of usability by students

To assess the usability of the VL, each student taking part in the experimental condition filled out 10 custom-made questions targeting various aspects of usability, e.g.: 'To what extent do you think the process you followed in the game was consistent with the actual experiment in the lab?'. Questions were rated on a five-point Likert scale ranging from (1) Not at all to (5) Very high degree. The items were constructed specifically for the current study, but loosely based on previously validated work on the use of technology, namely scales for psychological immersion and perceived realism (see Lipp et al. 2021) and a scale for ease of use (Davis 1989). A full overview of the usability items can be found in Table 1.

Table 1. Details of the post-experiment questionnaire filled by students.

2.3.3.2. Subjective assessment of effectiveness by students

To rate the perceived effectiveness of the preparation (i.e. its contribution to making the lab session as clear and smooth as possible), the students in both the experimental and control groups were asked to fill in a custom-made questionnaire about the material they used for the preparation of the lab work. The questionnaire was designed to specifically assess the objectives of the tool, which were explained in Section 2.1. Ratings of the materials were made on a five-point Likert-type scale. The scale was loosely based on validated scales for measuring the subjective effectiveness of a technology, namely perceived usefulness (Davis 1989), performance expectancy (Venkatesh et al. 2003), and behavioural intention (Davis 1989; Venkatesh et al. 2003). Table 1 presents the overview of the post-experiment questions.

2.3.3.3. Subjective and objective assessment of effectiveness by student assistants

The student assistants were asked to keep track of (1) the experiment duration, (2) the number of questions asked by the students of each group (regardless of the nature and subject of the question), and (3) the number of times an intervention was deemed necessary. To rate the effectiveness of the preparation tool, they were also asked to reflect on the performance of each group (i.e. subjectively) by filling out a separate questionnaire, which is shown in Table 2. As shown in this table, some of the assessment criteria are common to the student assistants and the students. Moreover, ratings were given on the same five-point Likert scale as the one used by the students. This was implemented to enable comparison of performance from both the external-assessor and self-reflection perspectives.

Table 2. Details of the post-experiment questionnaire filled by teacher assistants.

2.3.4. Data analysis

After data collection, the data were transferred to an Excel file and checked for accuracy. Then, the frequencies of reported procedural aspects, such as repetitions of the preparatory activities, were explored, and correlation analyses were performed to index the relationship between these events and the effectiveness of the preparation.

Subsequently, analyses were performed to address the various research questions. The usability of the VL was mapped by computing and comparing the means of each of the Likert-scale items targeting usability. The effectiveness scores were analysed in several ways. First, differences between the experimental and control groups in terms of students' subjective ratings of their own performance were tested using Mann–Whitney U-tests. Then, the student assistants' ratings of the performance of the groups were compared between the experimental and control groups using either t-tests for independent means or Mann–Whitney U-tests, depending on whether the distributions of the variables approached normality. Moreover, to assess whether there was agreement between students and student assistants in terms of performance, their ratings of performance were compared using Mann–Whitney U-tests. Finally, to address the differences between the experimental and control groups in terms of objective performance measures, t-tests for independent means were conducted.
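
The choice between parametric and non-parametric tests described above can be sketched as follows. This is an assumed illustration using scipy, not the authors' analysis script; normality and variance homogeneity are checked with the Kolmogorov–Smirnov and Levene tests named in Section 3.3.1, and the one-tailed group comparisons reported in Section 3.3 would pass alternative="greater".

    from scipy import stats

    # Assumed sketch: use a t-test when both groups look normal with
    # homogeneous variances; otherwise fall back to a Mann-Whitney U-test.
    def compare_groups(a, b, alternative="two-sided", alpha=0.05):
        normal = all(stats.kstest(stats.zscore(g), "norm").pvalue > alpha
                     for g in (a, b))
        homogeneous = stats.levene(a, b).pvalue > alpha
        if normal and homogeneous:
            return stats.ttest_ind(a, b, alternative=alternative)
        return stats.mannwhitneyu(a, b, alternative=alternative)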

3. Results

3.1. Descriptive analyses and correlations

Of the 50 students in the experimental groups, 30% mentioned that they managed to pass the VL training on the first try. The remaining 70% mentioned that they had to redo the training; no participant had to do the training more than twice. In the control groups, 36% of the students read the guideline only once, 52% read it twice, 7% read it three times, and only 5% mentioned that they had to read the guideline four times or more. As mentioned in Section 2.3.2, students in the experimental groups also had access to the guideline. To investigate whether the students from the experimental group who read the guideline before doing the VL performed better in the VL (i.e. passed the VL threshold on the first try), a statistical correlation analysis was performed. In the experimental group, 56% of students did not read the guideline; of these, 67% had to do the training twice. Among those who did read the guideline, 72% had to do the training twice. Both the Pearson correlation coefficient of the two variables (r = −0.05) and the chi-square test (χ2 = 0.14; p = 0.709) suggest that reading the guideline had no significant correlation with how many times the training needed to be repeated. This suggests that the content of the VL has little overlap with the guideline, since reading the guideline did not result in a noticeable improvement in VL performance.

Seventy per cent of the students in the experimental groups did not talk to their teammates about the training beforehand; of these, 72% had to do the training twice. Of the 30% of students who did talk to their peers about the training, 66% had to do the training twice. Again, both the Pearson correlation coefficient (r = 0.05) and the chi-square test (χ2 = 0.11; p = 0.736) suggest that there is no significant relationship between the number of times the training was done and peer discussion. So, it can be concluded that performance in the VL training was affected neither by reading the guideline nor by talking to peers about it.
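
Both checks pair a correlation coefficient with a chi-square test on two binary variables (e.g. read the guideline yes/no versus needed a retry yes/no). The sketch below shows one way to reproduce this pairing with scipy; the function and variable names are assumptions, and no raw data are reproduced here.

    import numpy as np
    from scipy import stats

    # Assumed sketch: association between two 0/1 variables, e.g.
    # read_guideline vs. needed_retry, via Pearson r (equivalent to the
    # phi coefficient for binary data) and a chi-square test on the
    # 2x2 contingency table.
    def binary_association(x, y):
        x, y = np.asarray(x), np.asarray(y)
        r, _ = stats.pearsonr(x, y)
        table = np.array([[np.sum((x == i) & (y == j)) for j in (0, 1)]
                          for i in (0, 1)])
        chi2, p, _, _ = stats.chi2_contingency(table, correction=False)
        return r, chi2, p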

3.2. Usability assessment

Figure 7 represents the assessment of the VL training environment from the usability perspective, based on the 10 items introduced in Section 2.3.3.1. As shown in this figure, the environment scored above average on all items.

Figure 7. Usability assessment of the VL training environment.

The most appreciated aspects of the environment were the ease of use, comprehensibility, and control, in that order. This indicates that, essentially, students faced no particular challenges in the comprehension and use of the environment. The lowest-scored aspect of the environment was procedural realism (average score of 3.54/5), which means that some students found that they needed to do the tasks in a slightly different order than prescribed in the environment. This happened because student assistants sometimes mistakenly deviated from the prescribed procedure, which indicates the importance of full alignment between the training environment and the supervisory staff in the lab. The physics of the environment received the second-lowest score (average of 3.6/5). As mentioned by students in the open comments, this is mainly because some of the objects in the VR scene did not have colliders (due to a design error), and the user could therefore pass through them. While the model accuracy scored high (average of 3.96/5), the graphical realism of the VR scene scored slightly lower (average of 3.82/5). However, given the purpose of the training environment, it was important that all students be able to install and use the environment smoothly without imposing high computer graphics requirements. In light of this compromise, the graphical fidelity of the model seems, in general, to have been acceptable. The other interface-related aspects of the environment (i.e. navigation and control) scored high and were appreciated.

3.3. Effectiveness assessment

3.3.1. Subjective assessment of lab performance by students

The next aspect of the environment assessed in this study was the perceived effectiveness. The questions answered by students can be found in Table 1. These questions aimed to assess the training material's impact on enhancing spatial awareness, procedural awareness, health and safety awareness, housekeeping, preparation, independence, and overall task readiness.

Figure 8 represents the results of the perceived effectiveness assessment by students (i.e. self-reflection). In this figure, colour-coded columns show the distributions of scores in each category (e.g. 32 students scored the spatial awareness of the VL training very high). In the same figure, the solid lines show the average score in each category, where red represents the VL training and yellow represents the guideline-based preparation. For instance, the average score for spatial awareness is 4.2 for the VL and 3.3 for the guideline-based training. As shown in this figure, the VL-based training scored higher than the guideline-based training on all items but one (i.e. preparation); from the students' perspective, the impact of both training materials on giving them an idea of what to prepare for the lab was the same. To test whether the perceived effectiveness of the VL and the textual guideline were rated differently, statistical analyses were performed, as shown in Table 3. To test the significance of the differences between the two groups, the normality and variance homogeneity of the data were first tested using the Kolmogorov–Smirnov test of normality and Levene's test of variance homogeneity, respectively. For data that met both conditions, independent-samples t-tests were used to assess group differences; for data that did not, Mann–Whitney U-tests were used. Because the primary aim of this pilot study was to see whether or not the VL improves performance, significance was only investigated in one tail. As shown in Table 3, the data representing the subjective assessment by students do not meet the criteria for the t-test. The Mann–Whitney tests suggest that for 6 of the 9 criteria, the difference between the scores of the two groups is significant: from the students' perspective, the training environment performed considerably better in terms of (1) honing spatial awareness, (2) procedural awareness, (3) health and safety, (4) housekeeping, (5) time-saving, and (6) overall task-readiness. Conversely, from a statistical standpoint, there is no considerable difference in how the two training materials helped students (1) make the necessary preparations for the lab work, (2) do the lab sessions independently, and (3) become interested in reusing similar types of training materials.

Figure 8. Comparative assessment of the effectiveness of the training based on a qualitative assessment by students.

Table 3. Analysis of the effectiveness based on a subjective assessment by students.

3.3.2. Subjective assessment of lab performance by student assistants

Figure 9 presents the results of the subjective assessment performed by student assistants. This assessment was based purely on the observation of the students' performance and not on direct questions or measurements. As shown in this figure, the student assistants (who were not aware of which training material each group of students had used for preparation) consistently scored the experiment groups higher on all items. Again, the analysis of the significance of the differences, shown in Table 4, suggests that the difference between the scores of the two groups is significant in all cases except health and safety. The same method as described in Section 3.3.1 was used to investigate significant variations. From the student assistants' perspective, the largest differences were observed in (1) procedural awareness, (2) housekeeping, and (3) spatial awareness. When the student assistants were asked to guess whether or not a group had used the VL training environment, they answered correctly in 89% of the cases (16 correct guesses, one false positive, and one false negative). In an informal discussion with the researchers after the whole experiment, the student assistants pointed out that students who used the VL training game needed very little guidance in terms of where to find different resources (i.e. ingredients or equipment) and how to organise themselves in the overall procedure; so much so that the student assistants could easily deduce which groups had done the VL training.

Figure 9. Comparative assessment of the effectiveness of the training based on a qualitative assessment by teacher assistants.

Table 4. Analysis of the effectiveness based on a subjective assessment by teacher assistants.

The subjective assessments by students and student assistants are compared to one another in Figure 10 and Table 5. Due to deviations from normality in the student data, Mann–Whitney tests were used with two-tailed significance testing. As shown in Table 5, the scores given by students and student assistants are statistically comparable in all cases except the procedural awareness and overall task-readiness of the control groups. This means that students who followed the guideline-based training overestimated both the extent to which they could recall the lab process and their general readiness for the experiment in the lab. An interesting observation is that, in general, the experimental groups tended to underestimate (albeit to a small degree) the contribution of the training environment to their performance, while the control groups tended to slightly overestimate it.

Figure 10. Comparison of scores by students with teacher assistants for (a) experiment groups, and (b) control group.

Table 5. Side-by-side comparison of subjective scores by students (i.e. self-reflection) and teacher assistants (external assessment).

3.3.3. Objective assessment of lab performance by student assistants

Finally, the results of the objective assessment of the groups' performances are presented in Table 6. As mentioned in Section 2.2, the objective effectiveness of the environment is defined in terms of the extent to which it contributes to making the lab session a smoother process. This is measured quantitatively in terms of the time students spent in the lab, the number of questions they asked in the lab, and the number of times an intervention by the student assistants was required.

Table 6. Analysis of the effectiveness based on an objective assessment by teacher assistants.

As shown, the experimental groups outperformed the control groups in terms of the duration of the lab session, spending on average 25 min (approximately 16%) less time in the physical lab sessions than the control groups. An independent-samples t-test demonstrated that this difference was significant at a 95% confidence level and had a large effect size, Cohen's d = 0.92. Regarding the two other metrics, although the experimental groups still outperformed the control groups, the differences are not statistically significant.
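
For reference, Cohen's d for two independent groups is the difference between the group means divided by the pooled standard deviation. The sketch below shows this computation; it is illustrative only, as the paper reports the resulting value (d = 0.92) rather than the raw durations.

    import numpy as np

    # Illustrative Cohen's d with a pooled standard deviation.
    def cohens_d(a, b):
        a, b = np.asarray(a, float), np.asarray(b, float)
        pooled_var = ((len(a) - 1) * a.var(ddof=1)
                      + (len(b) - 1) * b.var(ddof=1)) / (len(a) + len(b) - 2)
        return (a.mean() - b.mean()) / np.sqrt(pooled_var)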

4. Discussion and conclusions

This pilot study investigated the impact of using a VL as a training environment in the concrete mix design lab of a civil engineering programme. To this end, first, a realistic VL was developed to represent the actual layout of the concrete mix lab. Then, a gamification strategy was implemented in the VL to incorporate the elements of narrative, mission, and points. To assess the usability and effectiveness of the environment, the students were split into experimental and control groups, where the experimental groups used the VL training environment and the control groups used the traditional guideline. A post-experiment survey targeting both students and student assistants was conducted, focusing on the assessment of the usability and effectiveness of the VL training environment.

4.1. Usability of the gamified VL

The usability of the VL was assessed using a questionnaire filled out by students taking part in the experimental condition. Ratings of the various usability aspects revealed that, overall, engaging in the VL yielded few problems. Investigating usability is of crucial importance when it comes to the implementation of technological tools in education, as it is a prerequisite for effective learning (Asarbakhsh and Sandars 2013). From the perspective of the integration of the VL training into the lab work, the use of the training environment was seamless, and very few complaints were reported. The only complaint issued by students was that they could pass through some objects in the VL. This may have interfered with their sense of presence, which refers to the experience of being present in the virtual environment and may enhance learning performance (Grassini, Laumann, and Rasmussen Skogstad 2020). Otherwise, the students found the environment very easy to use. Moreover, the method by which the environment was implemented in the flow of education (i.e. registering the scores and making sure everyone passed the training before the actual lab work) appeared to be very organic and easy to follow. Contrary to the authors' expectation, none of the students in the experimental groups found the VL training long, or found it frustrating that they needed to do it more than once. This is evidenced by the strong willingness of the students (78% showed very high or high willingness) to reuse the environment in other lab sessions.

In general, the tool achieved high scores on all usability assessment criteria. This indicates the great potential of the tool for rendering the lab experience of students and student assistants more pleasant and playful (Asarbakhsh and Sandars 2013). Of course, certain design aspects of the tool can be enhanced in view of the received feedback; these include more realistic physics and a better representation of the lab. A very important procedural observation was that students noticed student assistants occasionally deviating from the procedures prescribed in the VL. While this is a minor flaw in the execution of the lab sessions, the fact that students who took part in the VL sessions noticed it can be seen as evidence of effective retention of the lab procedures. As suggested by other researchers, this can be ascribed to students' virtual experiences (Hatchard et al. 2019).

4.2. Effectiveness of the gamified VL

The effectiveness of the gamified VL was investigated from three different perspectives: (1) objective measures of student performance recorded by the student assistants, (2) subjective self-ratings by the students, and (3) subjective ratings of the students' performance by the student assistants.

In terms of objective measures of performance, the VL training environment contributed to reducing the time students spent in the lab by 16%, echoing a previous finding that VL sessions reduce the perceived time spent in the lab (Blackburn, Villa-Marcos, and Williams 2018). While general variation in the time spent in the lab can be attributed to normal differences in students' personal time needs and backgrounds, there is a statistically significant difference in time spent in the lab between the experimental and control groups, with the experimental groups spending notably less time in the lab. This difference can be attributed to the effect of the gamified VL on students' preparedness for lab work. This observation is a good indicator of the success of the tool, mainly because one of the main objectives of developing it was to reduce students' time in the lab (mainly in the wake of Covid).

The benefits of the current VL training, as indexed by subjective ratings of performance by both the students and the student assistants, align with previous studies on similar training materials (Hatchard et al. 2019; Ullah, Ali, and Rahman 2016a). The developed VL environment outperformed the conventional guideline-based training on the majority of the assessed criteria. The significant difference between the experimental and control groups in the overall task-readiness assessment made by the student assistants, together with the high accuracy of their predictions about who had used the VL training, suggests that the VL training has a tangible impact on the performance of the students in the lab.

It can, therefore, be concluded that the use of a VL in a complementary capacity, as a training environment, is very effective in enhancing students' performance in the lab and reducing the time spent in the lab. The gamification strategy implemented in the VL training proved successful in ensuring that students gained enough insight into the lab experiment without feeling frustrated. This suggests that the environment was engaging enough to foster student performance (Allen and Barker 2020; Vergara, Rubio, and Lorenzo 2017). On this premise, the main contribution of this pilot study can be summarised as presenting a thorough investigation of how a VL can be used as a training environment to improve engineering lab education, especially in the context of civil engineering. As shown in the introduction, this addresses a gap in the literature: many researchers have concluded that VL is best used in a complementary capacity, but very little research has been done on the integration of VL and gamification and its impact on engineering lab sessions.

4.3. Reflection, limitations, and future work

It should be highlighted that the results of this pilot study should be perceived as an early indication of potential. The study aimed to develop and test a gamified VL as a complementary preparation tool. To further investigate the impact of such a tool on students' performance, a more rigorous experimental setup would be useful in the future. Rigour can be improved in two ways, the first of which concerns instrumentation. In the current study, usability and effectiveness were rated with custom-made questionnaires that included one item per aspect of usability and effectiveness. To fortify the conclusions, these aspects should each be measured with multiple items, and the questionnaire should preferably be validated. Second, this study only considered the two extreme scenarios of using either (a) the preparation tool (both gamification and VL) or (b) the guideline. In line with conventional practices of engineering lab sessions, no pretest was conducted in the control group. As a result, the effects observed in the current study need to be interpreted with some caution, as it is not certain that random assignment to conditions leads to equally performing groups prior to the experiment. Moreover, the fact that the gamification strategy in the VL required students to 'pass' the training could have created a situation in which prepared students were compared with unprepared students. On the one hand, this indicates that the tool (and the gamification strategy therein) was successful in pushing the students to prepare for the lab without overwhelming them (78% willingness to reuse the tool). On the other hand, it is difficult to determine whether this can be attributed to the VL or to the gamification strategy. In more rigorous testing, one experimental group could use only the gamification in a non-VL environment (e.g. a written exam) and another group could use only the VL with no gamification. This would make it easier to determine the root cause of changes in the performance of students. This is one of the authors' planned follow-ups; the same course will be used next year to perform this study.

Additionally, the current study adopted two elements of gamification, namely the insertion of a narrative and the awarding of points to assess performance. Although there is no question about whether gamification was present in the current study, the gamification strategies were limited and can be expanded upon in future work by, for example, adding a visualisation of progress towards completion of the game through badges or awards (Bedwell et al. 2012) and social competition through leaderboards or halls of fame (Joshi 2022). The effects of individual gamification strategies as well as combinations of strategies need further empirical exploration.

A few practical learning points emerged from this pilot project: (1) The effectiveness of the VL environment as perceived by students seems to be tightly associated with the realism of the virtual environment. This aligns well with the aim of this VL, namely to teach students the physical layout of the lab. Nevertheless, the layout of the physical lab can change from one year to the next, creating a growing mismatch between the physical lab and the VL. It is therefore worthwhile to invest in the development of a customisable VL that allows teachers to easily modify the lab layout at the beginning of the course each year. Moreover, investing in realism may indeed be fruitful up to a certain point, but the preference for realism that prevails across the literature on VR for learning may partially be based on misconceptions (Skulmowski et al. Citation2021) and should therefore be treated with some caution. (2) Based on the feedback from the student assistants, while the VL had a palpable positive impact on students’ performance in the lab, it also seems to have caused a degree of rigidity in students’ perceptions of lab procedures. From a safety perspective this is not necessarily negative, but it might be construed as prescriptive rather than educative. In other words, to further develop the educational function of the VL, teachers may need strategies to incorporate the raison d'être of the instructions into the VL. This could help students master the core requirements of lab operation, construct situation-sensitive understandings of the operations, and gradually transition from followers to interpreters, so that they become conscious of lab rules and can apply them organically in new cases, using core knowledge of the lab aims flexibly (Spiro et al. Citation1991). (3) The VL developed in this pilot study was, by design, a single-player game. This contrasts with the majority of engineering lab sessions, in which experiments are done in groups. While the single-player setup ensures that all students are trained in all aspects of the lab procedure, it ignores the collaborative nature of the work. The possibility of implementing a multi-player setup in the VL could be explored in future work.

Future studies may address the effects of textual versus VL briefing on lab performance and disentangle the various aspects of the VL that may foster student performance, to find out what accelerates their learning. In particular, studies may examine the extent to which the use of technology in itself accelerates performance, and separate these effects from novelty effects, the effects of gamification strategies, and the effects of feedback and/or repetition on performance. This may help streamline the design of VLs to optimise their benefits for learning.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Faridaddin Vahdatikhaki

Faridaddin Vahdatikhaki is an assistant professor at the University of Twente’s Faculty of Engineering Technology, who researches Infrastructure and Building Information Modeling, simulation and visualisation of construction processes, sensor- and image-based tracking of construction equipment, virtual reality and serious gaming applications in the construction industry, and near real-time systems for project monitoring and control.

Ilona Friso-van den Bos

Ilona Friso-van den Bos is an assistant professor at the University of Twente’s Faculty of Behavioural and Management Sciences with research interests targeting how educational technology such as VR is used by, and suitable for, learners with various cognitive profiles.

Sajad Mowlaei

Sajad Mowlaei is a Video Game/VR/AR Developer with extensive experience in the development and improvement of gamified VR applications for entertainment and education.

Bas Kollöffel

Bas Kollöffel is an assistant professor at the University of Twente’s Faculty of Behavioural and Management Sciences with research interests targeting the use of technology-based, immersive training environments for professional and vocational training and education. Examples of such technologies are Virtual Reality (VR), Augmented Reality (AR), GoPro action cameras, simulators, games and online learning environments (including MOOCs and SPOCs).

References

  • Abdulwahed, M., and Z. K. Nagy. 2014. “The Impact of Different Preparation Modes on Enhancing the Undergraduate Process Control Engineering Laboratory: A Comparative Study.” Computer Applications in Engineering Education 22 (1): 110–119. https://doi.org/10.1002/CAE.20536.
  • Achuthan, K., and S. S. Murali. 2015. “A Comparative Study of Educational Laboratories from Cost & Learning Effectiveness Perspective.” Advances in Intelligent Systems and Computing 349: 143–153. https://doi.org/10.1007/978-3-319-18473-9_15.
  • Aliane, N., R. Pastor, and G. Mariscal. 2010. “Limitations of Remote Laboratories in Control Engineering Education.” International Journal of Online Engineering 6 (1): 31–33. https://doi.org/10.3991/ijoe.v6i1.1131.
  • Alkhaldi, T., I. Pranata, and R. I. Athauda. 2016. “A Review of Contemporary Virtual and Remote Laboratory Implementations: Observations and Findings.” Journal of Computers in Education 3 (3): 329–351. https://doi.org/10.1007/S40692-016-0068-Z.
  • Allen, T. E., and S. D. Barker. 2020. “BME Labs in the Era of COVID-19: Transitioning a Hands-on Integrative Lab Experience to Remote Instruction Using Gamified Lab Simulations.” Biomedical Engineering Education 1 (1): 99–104. https://doi.org/10.1007/S43683-020-00015-Y.
  • Alptekin, M., and K. Temmen. 2020. “Gamification in an Augmented Reality Based Virtual Preparation Laboratory Training.” Advances in Intelligent Systems and Computing 916: 567–578. https://doi.org/10.1007/978-3-030-11932-4_54.
  • Alrehaili, E. A., and H. Al Osman. 2019. “A Virtual Reality Role-Playing Serious Game for Experiential Learning.” Interactive Learning Environments, 1–14. https://doi.org/10.1080/10494820.2019.1703008.
  • Asarbakhsh, M., and J. Sandars. 2013. “E-Learning: The Essential Usability Perspective.” The Clinical Teacher 10 (1): 47–50. https://doi.org/10.1111/j.1743-498X.2012.00627.x.
  • Barham, W., J. Preston, and J. Werner. 2012. “Using a Virtual Gaming Environment in Strength of Materials Laboratory.” Congress on Computing in Civil Engineering, Proceedings, 105–112. https://doi.org/10.1061/9780784412343.0014.
  • Bedwell, W. L., D. Pavlas, K. Heyne, E. H. Lazzara, and E. Salas. 2012. “Toward a Taxonomy Linking Game Attributes to Learning: An Empirical Study.” Simulation and Gaming 43 (6): 729–760. https://doi.org/10.1177/1046878112439444.
  • Bennie, S. J., K. E. Ranaghan, H. Deeks, H. E. Goldsmith, M. B. O’Connor, A. J. Mulholland, and D. R. Glowacki. 2019. “Teaching Enzyme Catalysis Using Interactive Molecular Dynamics in Virtual Reality.” Journal of Chemical Education 96 (11): 2488–2496. https://doi.org/10.1021/ACS.JCHEMED.9B00181.
  • Bhute, V. J., P. Inguva, U. Shah, and C. Brechtelsbauer. 2021. “Transforming Traditional Teaching Laboratories for Effective Remote Delivery—A Review.” Education for Chemical Engineers 35: 96–104. https://doi.org/10.1016/J.ECE.2021.01.008.
  • Blackburn, R. A., B. Villa-Marcos, and D. P. Williams. 2018. “Preparing Students for Practical Sessions Using Laboratory Simulation Software.” Journal of Chemical Education 96 (1): 153–158. https://doi.org/10.1021/acs.jchemed.8b00549.
  • Bonde, M. T., G. Makransky, J. Wandall, M. V. Larsen, M. Morsing, H. Jarmer, and M. O. A. Sommer. 2014. “Improving Biotech Education Through Gamified Laboratory Simulations.” Nature Biotechnology 32 (7): 694–697. https://doi.org/10.1038/nbt.2955.
  • Chen, P. H. 2020. “The Design of Applying Gamification in an Immersive Virtual Reality Virtual Laboratory for Powder-Bed Binder Jetting 3DP Training.” Education Sciences 10 (7): 172. https://doi.org/10.3390/EDUCSCI10070172.
  • Chiew, F. H., B. C. Bidaun, and R. T. J. Sipi. 2021. “Assessing Psychomotor Domain in Civil Engineering Design Project During Pandemic.” International Journal of Service Management and Sustainability 6 (2): 77–97. https://doi.org/10.24191/ijsms.v6i2.15573.
  • Darrah, M., R. Humbert, J. Finstein, M. Simon, and J. Hopkins. 2014. “Are Virtual Labs as Effective as Hands-on Labs for Undergraduate Physics? A Comparative Study at Two Major Universities.” Journal of Science Education and Technology 23 (6): 803–814. https://doi.org/10.1007/S10956-014-9513-9.
  • Davis, F. D. 1989. “Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology.” MIS Quarterly 13 (3): 319–340. https://doi.org/10.2307/249008.
  • Doak, D. G., G. S. Denyer, J. A. Gerrard, J. P. Mackay, and J. R. Allison. 2020. “Peppy: A Virtual Reality Environment for Exploring the Principles of Polypeptide Structure.” Protein Science 29 (1): 157–168. https://doi.org/10.1002/PRO.3752.
  • Drace, K. 2013. “Gamification of the Laboratory Experience to Encourage Student Engagement.” Journal of Microbiology & Biology Education 14 (2): 273. https://doi.org/10.1128/JMBE.V14I2.632.
  • Gamage, K. A. A., D. I. Wijesuriya, S. Y. Ekanayake, A. E. W. Rennie, C. G. Lambert, and N. Gunawardhana. 2020. “Online Delivery of Teaching and Laboratory Practices: Continuity of University Programmes During COVID-19 Pandemic.” Education Sciences 10 (10): 291. https://doi.org/10.3390/EDUCSCI10100291.
  • Gopalan, M., K. Rosinger, and J. B. Ahn. 2020. “Use of Quasi-Experimental Research Designs in Education Research: Growth, Promise, and Challenges.” Review of Research in Education 44 (1): 218–243. https://doi.org/10.3102/0091732X20903302.
  • Grassini, S., K. Laumann, and M. Rasmussen Skogstad. 2020. “The Use of Virtual Reality Alone Does Not Promote Training Performance (but Sense of Presence Does).” Frontiers in Psychology 11. https://doi.org/10.3389/fpsyg.2020.01743.
  • Grodotzki, J., T. R. Ortelt, and A. E. Tekkaya. 2018. “Remote and Virtual Labs for Engineering Education 4.0: Achievements of the ELLI Project at the TU Dortmund University.” Procedia Manufacturing 26: 1349–1360. https://doi.org/10.1016/J.PROMFG.2018.07.126.
  • Guerrero-Mosquera, L. F., D. Gómez, and P. Thomson. 2018. “Development of a Virtual Earthquake Engineering Lab and Its Impact on Education.” DYNA 85 (204): 9–17. https://doi.org/10.15446/DYNA.V85N204.66957.
  • Hatchard, T., F. Azmat, M. Al-Amin, Z. Rihawi, B. Ahmed, and A. Alsebae. 2019. “Examining Student Response to Virtual Reality in Education and Training.” IEEE International Conference on Industrial Informatics (INDIN), 1145–1149. https://doi.org/10.1109/INDIN41052.2019.8972023.
  • Hensen, C., and J. Barbera. 2019. “Assessing Affective Differences between a Virtual General Chemistry Experiment and a Similar Hands-On Experiment.” Journal of Chemical Education 96 (10): 2097–2108. https://doi.org/10.1021/ACS.JCHEMED.9B00561.
  • Joshi, A. 2022. “Supporting Student Motivation through Social Comparison.” In Conference on Technology Enhanced Learning. https://ceur-ws.org/Vol-3292/DCECTEL2022_paper03.pdf.
  • Kamińska, D., T. Sapiński, S. Wiak, T. Tikk, R. E. Haamer, E. Avots, A. Helmi, C. Ozcinar, and G. Anbarjafari. 2019. “Virtual Reality and Its Applications in Education: Survey.” Information 10 (10): 318. https://doi.org/10.3390/INFO10100318.
  • Karakasidis, T. 2013. “Virtual and Remote Labs in Higher Education Distance Learning of Physical and Engineering Sciences.” IEEE Global Engineering Education Conference, EDUCON, 798–807. https://doi.org/10.1109/EDUCON.2013.6530198.
  • Kim, E., L. Rothrock, and A. Freivalds. 2016. “The Effects of Gamification on Engineering Lab Activities.” Proceedings - Frontiers in Education Conference (FIE). https://doi.org/10.1109/FIE.2016.7757442.
  • Kollöffel, B., and T. de Jong. 2013. “Conceptual Understanding of Electrical Circuits in Secondary Vocational Engineering Education: Combining Traditional Instruction with Inquiry Learning in a Virtual lab.” Journal of Engineering Education 102 (3): 375–393. https://doi.org/10.1002/jee.20022.
  • Krontiris, A. 2021. Virtual Labs – Challenges, Opportunities and Practical Lessons Learned. 1–4. https://doi.org/10.1109/MPS52805.2021.9492610.
  • Li, Y., D. Zhang, H. Guo, and J. Shen. 2017. “A Novel Virtual Simulation Teaching System for Numerically Controlled Machining.” International Journal of Mechanical Engineering Education 46 (1): 64–82. https://doi.org/10.1177/0306419017715426.
  • Lipp, N., R. Sterna, N. Dużmańska-Misiarczyk, A. Strojny, S. Poeschl-Guenther, and P. Strojny. 2021. “VR Realism Scale - Revalidation of Contemporary VR Headsets on a Polish Sample.” PLoS ONE 16 (12): e0261507. https://doi.org/10.1371/journal.pone.0261507.
  • Ma, G. G., J. P. Voccio, D. E. Perkins, and T. Greene. 2021. Introduction to Engineering Virtual Labs - Challenges and Improvements.
  • Maloney, S., M. Storr, S. Paynter, P. Morgan, and D. Ilic. 2012. “Investigating the Efficacy of Practical Skill Teaching: A Pilot-Study Comparing Three Educational Methods.” Advances in Health Sciences Education 18 (1): 71–80. https://doi.org/10.1007/S10459-012-9355-2.
  • Marques, M. A., M. C. Viegas, M. C. Costa-Lobo, A. V. Fidalgo, G. R. Alves, J. S. Rocha, and I. Gustavsson. 2014. “How Remote Labs Impact on Course Outcomes: Various Practices Using VISIR.” IEEE Transactions on Education 57 (3): 151–159. https://doi.org/10.1109/TE.2013.2284156.
  • McAfee, M., P. Armstrong, and G. Cunningham. 2009. Achieving Effective Learning in Engineering Laboratory Classes. 5th International CDIO Conference, Singapore Polytechnic, Singapore. http://cdio.org/files/document/file/D6.1.pdf.
  • Pàmies Vilà, R., A. Fabregat Sanjuan, J. Puig Ortiz, L. Jordi Nebot, and A. Hernández Fernández. 2022. “Aplication of a Gamification Learning System in Mechanical Engineering Studies.” 50th SEFI Annual Conference, 48–57. https://upcommons.upc.edu/handle/2117/374374.
  • Potkonjak, V., M. Gardner, V. Callaghan, P. Mattila, C. Guetl, V. M. Petrović, and K. Jovanović. 2016. “Virtual Laboratories for Education in Science, Technology, and Engineering: A Review.” Computers & Education 95: 309–327. https://doi.org/10.1016/J.COMPEDU.2016.02.002.
  • Rivera, E. S., and C. L. Palmer Garden. 2021. “Gamification for Student Engagement: A Framework.” Journal of Further and Higher Education 45 (7): 999–1012. https://doi.org/10.1080/0309877X.2021.1875201.
  • Schnieder, M., S. Williams, and S. Ghosh. 2022. “Comparison of In-Person and Virtual Labs/Tutorials for Engineering Students Using Blended Learning Principles.” Education Sciences 12 (3): 153. https://doi.org/10.3390/EDUCSCI12030153.
  • Schofield, D., and E. Lester. 2010. “Virtual Chemical Engineering: Guidelines for e-Learning in Engineering Education.” Seminar.net 6 (1). https://doi.org/10.7577/seminar.2459.
  • Sharma, R. S. 2016. “Technology Enabled Learning of Metal Forming Processes for Engineering Graduates Using Virtual Simulation lab.” International Journal of Mechanical Engineering Education 44 (2): 133–147. https://doi.org/10.1177/0306419016640436.
  • Skulmowski, A., S. Nebel, M. Remmele, and G. D. Rey. 2021. “Is a Preference for Realism Really Naive After all? A Cognitive Model of Learning with Realistic Visualizations.” Educational Psychology Review 34 (2): 649–675. https://doi.org/10.1007/s10648-021-09638-1.
  • Spiro, R. J., P. J. Feltovich, M. J. Jacobson, and R. L. Coulson. 1991. “Cognitive Flexibility, Constructivism, and Hypertext: Random Access Instruction for Advanced Knowledge Acquisition in Ill-Structured Domains.” Educational Technology 31 (5): 24–33.
  • Stuchlikova, L., A. Kosa, P. Benko, and P. Juhasz. 2017. “Virtual Reality vs. Reality in Engineering Education.” ICETA 2017 - 15th IEEE International Conference on Emerging eLearning Technologies and Applications, Proceedings. https://doi.org/10.1109/ICETA.2017.8102533.
  • Subhash, S., and E. A. Cudney. 2018. “Gamified Learning in Higher Education: A Systematic Review of the Literature.” Computers in Human Behavior 87: 192–206. https://doi.org/10.1016/J.CHB.2018.05.028.
  • Tan, S. W. B., P. K. Naraharisetti, S. K. Chin, and L. Y. Lee. 2020. “Simple Visual-Aided Automated Titration Using the Python Programming Language.” Journal of Chemical Education 97 (3): 850–854. https://doi.org/10.1021/ACS.JCHEMED.9B00802.
  • Ullah, S., N. Ali, and S. U. Rahman. 2016a. “The Effect of Procedural Guidance on Students’ Skill Enhancement in a Virtual Chemistry Laboratory.” Journal of Chemical Education 93 (12): 2018–2025. https://doi.org/10.1021/ACS.JCHEMED.5B00969.
  • Venkatesh, V., M. G. Morris, G. B. Davis, and F. D. Davis. 2003. “User Acceptance of Information Technology: Toward a Unified View.” MIS Quarterly 27 (3): 425–478. https://doi.org/10.2307/30036540.
  • Vergara, D., J. Extremera, M. P. Rubio, and L. P. Dávila. 2019. “Meaningful Learning Through Virtual Reality Learning Environments: A Case Study in Materials Engineering.” Applied Sciences 9 (21): 4625. https://doi.org/10.3390/APP9214625.
  • Vergara, D., P. Fernández-Arias, J. Extremera, L. P. Dávila, and M. P. Rubio. 2021. “Educational Trends Post COVID-19 in Engineering: Virtual Laboratories.” Materials Today: Proceedings. https://doi.org/10.1016/J.MATPR.2021.07.494.
  • Vergara, D., M. P. Rubio, and M. Lorenzo. 2016. “New Approach for the Teaching of Concrete Compression Tests in Large Groups of Engineering Students.” Journal of Professional Issues in Engineering Education and Practice 143 (2): 05016009. https://doi.org/10.1061/(ASCE)EI.1943-5541.0000311.
  • Vergara, D., M. P. Rubio, and M. Lorenzo. 2017. “On the Design of Virtual Reality Learning Environments in Engineering.” Multimodal Technologies and Interaction 1 (2): 11. https://doi.org/10.3390/MTI1020011.
  • Zhang, Y., P. Mao, H. Li, Y. Xu, D. You, H. Liu, W. Huang, and J. Yuan. 2020. “Assessing the Safety Risks of Civil Engineering Laboratories Based on Lab Criticity Index: A Case Study in Jiangsu Province.” International Journal of Environmental Research and Public Health 17 (17): 6244. https://doi.org/10.3390/IJERPH17176244.