
Preparative pre-laboratory online resources for effectively managing cognitive load of engineering students

Pages 113-138 | Received 04 Apr 2022, Accepted 11 Apr 2023, Published online: 04 May 2023

ABSTRACT

Laboratory learning forms a significant and integral part of engineering education, wherein students develop technical, collaborative, enquiry, and observation skills that go beyond theoretical studies. Engineering is inherently a practice-based profession, and consequently, laboratory work has been inseparable from engineering curricula. However, there is evidence that students encounter cognitive or information overload during modern laboratory classes, reducing the likelihood of effectively achieving intended learning outcomes, as described by cognitive load theory. This article investigates how pre-lab online learning resources can be utilised to effectively manage the cognitive load of engineering students, in the context of thermodynamics education. Qualitative and quantitative data have been collected from students as well as teaching staff on a range of parameters such as student preparedness, confidence levels, degree of engagement, and understanding. The findings of the study indicate that efficient student preparation through pre-lab online resources can effectively manage the cognitive load of engineering students, leading to increased levels of understanding, preparedness, and confidence. Influences from factors such as the quality of pre-lab resources and the degree of engagement were also observed.

Introduction

Laboratory learning forms a significant and integral part of engineering education (Feisel and Rosa Citation2005; Baker Citation2005). Effectively delivered laboratory classes offer opportunities for engineering students to develop technical skills that go beyond theoretical studies and enhance problem-solving skills. In particular, engineering is a practice-based profession by nature and traditionally, laboratory work has been a vital part of undergraduate and postgraduate engineering curricula. Consequently, the laboratory component plays an inseparable role in effective delivery of engineering education.

According to cognitive load theory (CLT), the human brain can process only a relatively small amount of new information at once, although it can process larger amounts of stored information (Kirschner, Sweller, and Clark Citation2006; Paas, Renkl, and Sweller Citation2003; Sweller, Van Merrienboer, and Paas Citation1998). Scholarly evidence suggests that students tend to encounter cognitive overload during modern laboratory classes, which in turn reduces the likelihood of effectively achieving intended learning outcomes (Johnstone Citation1997; Jones and Edwards Citation2010). For example, the laboratory manual, unfamiliar equipment and materials, verbal instructions, and time management have been identified as sources of cognitive overload for students (Reid and Shah Citation2007). Early studies by Johnstone, Sleet, and Vianna (Citation1994) argued that, as a consequence, student minds become preoccupied with all this information during laboratory classes, causing them to mechanically follow instructions and hindering their preparedness and overall learning experience. More recent studies by Jolley et al. (Citation2016) and Rodgers et al. (Citation2020) support this argument.

To address this issue, national (Jolley et al. Citation2016) and international (Cann Citation2016; Johnstone, Sleet, and Vianna Citation1994) research studies have been conducted to qualitatively and quantitatively evaluate the effectiveness of pre-lab online resources in science and engineering higher education contexts. The key idea behind pre-lab preparation is the reduction, or effective management, of the amount of new information students are exposed to during the laboratory (Reid and Shah Citation2007). The effort and time spent by students in preparation prior to lab classes can free up space in their working memory during the actual lab class, reducing the likelihood of cognitive overload and enabling deeper engagement with concepts and subsequent analyses (Winberg and Berg Citation2007), an argument supported by CLT (Paas, Renkl, and Sweller Citation2003). At the same time, it is noteworthy that the vast majority of the available literature in this regard emerges from the fields of chemistry and other general sciences, with a limited number of works reported in the area of engineering (Rodgers et al. Citation2020).

As there are fundamental differences between engineering and science labs, findings derived for science labs cannot be applied seamlessly to engineering labs. This difference arises from the foundational differences between the two disciplines in terms of their investigative constructs. Some scholars generally define science as the determination of general laws and theories and the deduction of principles based on experimental observations (Ramsden Citation2012). Engineering, on the other hand, can be generally defined as the practical application and scaling-up of scientific principles, coupled with imagination, to create useful products and processes for the benefit of society and industry. Consequently, science lab classes are often dedicated to deeply understanding the fundamental laws and theories behind natural and physical phenomena. In contrast, engineering lab classes dominantly focus on the application- and practice-related aspects of those principles at larger scales (Rahman Citation2017; Gillie et al. Citation2017). These fundamental differences in investigative constructs correlate with key differences in student thinking in science and engineering. Engineering students' critical thinking is generally more focussed on developing optimum and practical solutions to engineering problems, which are naturally multi-dimensional. A report by the Royal Academy of Engineering describes this as utilising Engineering Habits of Mind (EHoM) for 'making things that work' and 'making things work better' (Lucas, Claxton, and Hanson Citation2014). This typically depends on structured and complex thought processes that demand evaluation, interpretation, and opinion (Ahern et al. Citation2019).
Science students' thinking is more focused on grasping deeper levels of understanding of the fundamental principles behind natural and physical phenomena, whereas engineering students' thinking focuses more on scaling-up, practical applications, and efficiency improvements based on those fundamental principles. Subsequently, engineering labs focus on how to develop fundamental scientific principles to more applicable, manufacturable, and commercial levels in terms of design, manufacturing, and maintenance.

Against this background, this article aims to qualitatively and quantitatively investigate how pre-lab online learning resources can be utilised to effectively manage the cognitive load of engineering students. In doing so, pre-lab online multimedia resources have been introduced to a final-year engineering thermodynamics course unit of a Bachelor of Engineering (Honours) degree. The pre-lab online preparatory resources used in this research aim to distribute and effectively manage the cognitive load of students by exposing them to the required laboratory knowledge, theories, concepts, and protocols in a two-week period before their lab classes. As discussed above, this study has been designed on the premise that the time dedicated by students in preparation for their laboratory class would free up cognitive capacity and reduce the amount of new information handled at any given time, permitting them to focus and perform better in laboratories and decreasing the possibility of cognitive overload as per CLT. This creates a higher probability of in-depth and meaningful engagement with content matter, which is crucial for the attainment of the intended learning outcomes.

This article is based on initial research findings from the first year of an ongoing four-year (2020–2024) longitudinal study conducted at Queensland University of Technology (QUT) in Brisbane, Australia. This article analyses data for the study-year 2020, although the overall longitudinal study runs until 2024. It should also be noted that, for comparison purposes, archival records of student grades and marks for the study-year 2019 have also been considered. The effectiveness of the pre-lab materials in managing student cognitive load has been evaluated mainly through a student attitudinal survey and a staff focus group. As the initial phase of this research took place during the first wave of COVID-19 in Australia, a number of new challenges affected student learning, preparedness, confidence levels, and cognitive load. However, evaluating the effects of the pandemic is not a research objective of this investigation, although there could be confounding effects on student learning.

The key Research Questions (RQs) that have been driving this research investigation are:

RQ1. How prepared and confident do students perceive themselves to be after using pre-lab resources?

RQ2. How prepared and confident do teaching staff perceive students to be after using pre-lab resources?

RQ3. Based on these proxy measurements (i.e. student preparedness and confidence levels), how do pre-lab resources affect student cognitive load?

This paper has been structured as follows. In the Methodology section, key details of the theoretical background, research study design, data collection, coding, and analysis have been provided in a descriptive manner. The Results and Discussion section elaborates the evaluation of correlations among pre-lab learning resources, cognitive load, preparedness, confidence levels, and understanding through the perspectives of students and teaching staff while establishing direct links to Research Questions (RQs). Finally, in the Conclusions and Recommendations section, key findings, limitations, and recommendations for future work have been outlined.

Methodology

This study adopts a mixed-methods research approach to investigate how pre-lab online learning resources can be utilised to effectively manage the cognitive load of engineering students. In doing so, qualitative and quantitative data have been collected from students as well as teaching staff on a range of parameters such as student preparedness, confidence levels, degree of engagement, and understanding.

Theoretical background

This study utilises the lens of CLT, which states that the human brain can process only a relatively small amount of new information at once, although it can process larger amounts of stored information in the form of schemas (Sweller, Van Merrienboer, and Paas Citation1998; Kirschner, Sweller, and Clark Citation2006; Paas, Renkl, and Sweller Citation2003). CLT is fundamentally an instructional theory built on our knowledge of human cognition. It is based on a number of widely accepted assumptions about how human brains process and store information: human memory can be divided into working memory and long-term memory; information is stored in long-term memory in the form of schemas; and processing new information results in 'cognitive load' on working memory, which can directly affect learning outcomes (Gerjets, Scheiter, and Cierniak Citation2009). According to CLT, working memory can be affected by the intrinsic nature of the learning materials, the manner in which the materials are presented, and the activities required of the students, leading to different types of cognitive load: intrinsic, extraneous, and germane (Abeysekera and Dawson Citation2015; Buchner, Buntins, and Kerres Citation2021). The intrinsic load relates to the inherent nature of a given task, problem, or concept, while the extraneous load is the additional load imposed on cognition by how the problem or concept is presented to the learner. The germane load is the additional load that assists the formation of schemas in long-term memory, which is helpful for learning. For a given task, effective management of the intrinsic, extraneous, and germane loads can be used in learning design to optimally utilise working memory and the formation of schemas in long-term memory.

Although all three types of cognitive load could potentially be affected by the proposed pre-lab resources framework, it is anticipated that its positive influence on the extraneous and germane loads would lead to more effective management of the overall cognitive load. That is, through the introduction of pre-lab resources, the learning task is presented to the students in a more distributed and modular manner, leading to increased preparedness and reduced extraneous load at any given point during the overall learning exercise. Further, through the structured, distributed, and scaffolded influence of the proposed pre-lab resources framework on the actual lab classes, it is expected that the formation of schemas in long-term memory becomes more effective and convenient, affecting the germane load. It should be noted that, due to the change in the number and nature of the tasks students conduct in relation to the lab classes, there could be a potential increase in the intrinsic load. However, it is expected that the combined positive influence on the extraneous and germane loads would far outweigh any increase in the intrinsic load.

Measuring cognitive load through proxies

Previous research studies demonstrate that there is a strong correlation between cognitive load (or available cognitive capacity) and student preparedness (Akkaraju Citation2016; Jolley et al. Citation2016; Jones and Edwards Citation2010). For example, if a student possesses more 'preparedness' for a given lab class, more cognitive capacity remains in their working memory to successfully conduct relevant experiments, observe results, and draw subsequent conclusions, an argument supported by CLT as discussed above. In addition, there is evidence suggesting a direct correlation between cognitive load and confidence levels (Gavas et al. Citation2018). For instance, if a student has more cognitive capacity in their working memory to meaningfully engage with experiments, there is a higher probability that they will build more confidence during the overall learning exercise. On the other hand, undesirably high levels of cognitive load can lead to poor confidence levels. Accordingly, an overall procedural correlation among cognitive load, preparedness, and confidence levels can be drawn as given in Figure 1. Based on this premise, in this investigation, we have utilised 'student preparedness' and 'confidence levels' as 'proxies' to measure corresponding cognitive loads.

Figure 1. A procedural correlation among student preparedness, cognitive load and confidence levels.


However, it should be noted that, apart from cognitive load and preparedness, a variety of factors can influence students' confidence levels, for example, mental and physical wellbeing, individual preferences, and other socio-economic conditions, to name just a few. The influence of these other factors on cognitive load is not measured in the current study as it is beyond its scope, although they might lead to interesting future research. Furthermore, directly and conveniently measuring mental load has persistently proven very challenging in CLT research (Kirschner, Ayres, and Chandler Citation2011; van Gog and Paas Citation2008). Two reasons for this are that perceived task difficulties and invested mental efforts lead to non-equivalent ratings, and that objective online measures such as heart-rate variability are intrusive and technologically demanding. Although subjective rating scales for measuring cognitive load, such as the Cognitive Load Scale (CLS) (Leppink et al. Citation2013) and the Naïve Rating Scale (NRS) (Klepsch, Schmitz, and Seufert Citation2017), could be useful tools for assessing perceived relative cognitive load, the reliability and internal consistency of such subjective measures can be questionable (Thees et al. Citation2021). The NASA Task Load Index (NASA-TLX) (Hart and Staveland Citation1988) is another common subjective measure of relative cognitive load, which has its own limitations in consistently estimating actual cognitive load levels (Buchner, Buntins, and Kerres Citation2021). Considering all this, we have utilised 'preparedness' and 'confidence levels' as 'proxies' to measure student cognitive loads during this investigation.
This 'proxy' selection also makes data collection and analysis more convenient while providing a reasonably accurate indication of the overall cognitive impact (Paas and Van Merriënboer Citation1993). However, the use of objective online measures and subjective rating scales could be interesting avenues for insightful future research.

Study design

In July 2020, the research team introduced pre-lab online learning resources to a final-year undergraduate thermodynamics laboratory class (EGH422-Advanced Thermodynamics), consisting of 218 students. Since then, as part of this ongoing study, pre-lab learning resources have been a part of students’ preparation for lab classes. Access and utilisation of pre-lab online resources were integrated into the curriculum as a standard requirement for all enrolled students in this course unit and made available on the university’s Learning Management System (LMS), Blackboard (Citation2020).

Three different forms of preparatory resources were provided to the students to prepare for the Simulation (Computer Lab) and Heat Exchanger Experiment classes: pre-lab videos, pre-lab handouts, and pre-lab e-quizzes. These two lab classes are critical components of this course unit as they provide a means of practical and real observation of thermodynamics concepts. The design and implementation of this pre-lab resources framework were undertaken in such a manner that it would have an overall desirable impact on the cognitive load of students. As discussed previously under the Theoretical Background sub-section, it was expected that the combined positive influence of the framework on extraneous and germane loads would far outweigh any increment of intrinsic load.

Pre-lab videos

Audio-visual resources such as videos were developed in a manner that provides visual and auditory explanations to reinforce the knowledge of thermodynamics concepts used in the laboratory (Ashaver and Igyuve Citation2013; Ulloa Salazar and Díaz Larenas Citation2018; Gillie et al. Citation2017).

Table 1 provides details of the videos available to students as part of pre-lab learning for the ANSYS Simulation (also known as the 'Computer Lab' class). ANSYS is a Computer-Aided Engineering (CAE) and multiphysics simulation software package used in various engineering disciplines (ANSYS Citation2020). In this course unit, ANSYS Fluent 19.0 was used for the Computer Lab classes.

Table 1. Details of pre-lab videos provided to students as part of their preparation for the ANSYS Simulation laboratory.

Similarly, Table 2 provides details of the videos the students were assigned to watch as part of pre-lab learning for the Heat Exchanger Experiment classes.

Table 2. Details of pre-lab videos provided to students as part of their preparation for the Heat Exchanger laboratory.

Pre-lab handouts

Pre-lab handouts typically consisted of a four-page PDF (portable document format) document made available to students via Blackboard (the LMS) with laboratory information such as experimental aims, data and equipment-related instructions, procedures, and assessment details.

Table 3 provides specific information on the pre-lab preparative handouts students were assigned to read as part of pre-lab learning for the ANSYS Simulation and Heat Exchanger Experiment classes.

Table 3. Pre-lab handouts provided to students as part of their preparation for the ANSYS Simulation and Heat Exchanger laboratory classes.

Pre-lab e-quizzes

E-quizzes were employed at multiple stages to provide summative and formative self-assessment for students, assessing their understanding of key subject matter relevant to the laboratory. Students were encouraged to complete two short e-quizzes consisting of multiple-choice questions prior to attending each lab (i.e. the ANSYS Simulation and the Heat Exchanger Experiment). Each quiz was allocated 2.5% of the total mark (out of 100) to motivate students to take part in the quizzes and other pre-lab resources. Although this was a relatively small percentage of the total mark, it led to an average completion rate of around 94% for both e-quizzes.

Tables 4 and 5 contain information on the e-quizzes the students were assigned to complete as part of pre-lab learning consolidation for the ANSYS Simulation and Heat Exchanger classes, respectively.

Table 4. Details of pre-lab e-quizzes provided to students as part of their preparation for the ANSYS Simulation laboratory.

Table 5. Details of pre-lab e-quizzes provided to students as part of their preparation for the Heat Exchanger laboratory.

Data collection

In this study, both qualitative and quantitative data were collected to ensure a robust dataset. At the end of the academic year, in November 2020, the effectiveness of pre-lab online resources in managing student cognitive load and enhancing the overall learning experience was evaluated via a student attitudinal survey and a staff focus group. All these data were collected for the 2020 cohort (July–November) of this course unit. All procedures performed in this study involving human participants were in accordance with relevant ethical approval at the institutional level (QUT Human Research Ethics Committee (UHREC) Ethics Approval Number: 2000000667). For the use of anonymised student grades data, a waiver of consent was granted by the QUT UHREC based on the criteria in Section 2.3.10 of the National Statement on Ethical Conduct in Human Research of Australia ('National Statement on Ethical Conduct in Human Research 2007 (Updated 2018)').

Student attitudinal survey

The student survey consisted of 32 questions and was designed to collect information regarding student demographics, differences in workload, preparedness, and confidence, as well as overall satisfaction and impact on learning. Some of these questions asked directly or very closely about the preparedness of students, a proxy utilised in this study to measure cognitive load (e.g. Questions 7, 12, 14, 15, and 16 in Appendix 2). In addition, a number of specific questions were included to evaluate the perceived quality and effectiveness of pre-lab resources in developing students' understanding. A key reason for using these questions on developed understanding is that students' individual beliefs in their developed knowledge and ability to solve problems can map directly to their preparedness and confidence levels on subject matter (Dray et al. Citation2011). The intention here was to add a secondary layer of validation and mapping to measure the cognitive load of students as influenced by pre-lab resources.

The survey was created using Microsoft Forms (Microsoft-Corporation Citation2020) and was distributed via email and through the online LMS (i.e. Blackboard) at the end of the academic period for the July-2020 Advanced Thermodynamics cohort. This type of survey and distribution mechanism was selected by considering wide accessibility, convenience of participation, and time-efficiency for large cohorts as key criteria.

The survey was voluntary and allowed students to remain anonymous while participating; student consent was ensured through a disclaimer at the beginning of the survey indicating that undertaking the survey constituted consent. The survey consisted of a mix of question styles in order to attract a varied and rich set of responses. The question types included open-ended, multiple-choice, ordinal scale, Likert scale, and star rating questions.

To promote student participation, Blackboard and lecture announcements were made. It was initially expected that, out of 218 students, at least 20–30 students (10–15%) would take part in the survey. The response rate exceeded this expectation, with 17.4% of students participating, answering all 32 questions in full.
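As a hedged aside, the reported figures can be cross-checked with simple arithmetic: a 17.4% response rate for a cohort of 218 corresponds to roughly 38 complete responses. This respondent count is our inference from the reported percentages, not a figure stated in the article.

```python
# Cross-check of the reported survey response rate.
# The respondent count (38) is inferred from the reported 17.4% of
# 218 students; it is not stated explicitly in the article.
cohort_size = 218        # enrolled students, July-2020 cohort
reported_rate = 17.4     # reported response rate (%)

respondents = round(reported_rate / 100 * cohort_size)
recovered_rate = respondents / cohort_size * 100

print(respondents)                # 38
print(f"{recovered_rate:.1f}%")   # 17.4% -- consistent with the report
```

The recovered rate matches the reported value, and 38 respondents also comfortably exceeds the initial expectation of 20–30 participants.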

Staff focus group

The staff focus group was conducted with all eight staff members who were involved in teaching the July-2020 Advanced Thermodynamics cohort. All of these teaching staff had been involved with this course unit in different roles for at least two iterations by that time, most of them for longer. The teaching staff were all familiar with the pre-lab learning resources and were chosen for the focus group in order to gather their input on perceived student understanding and performance in the lab classes and the course unit as a whole. Perceived cognitive load was specifically asked about during the focus group. In addition, the perceived quality of the pre-lab learning resources was a key criterion to be assessed. A focus group was selected for this data collection activity because it allowed for a broader conversation among the teaching group, with significant potential to yield rich insights and critical dialogue.

Participation was voluntary and undertaken only upon written consent from each staff member. A participant information sheet was provided outlining the objectives of the research, what was required of participants, expected benefits, risks, privacy and confidentiality matters, consent to participation, and contact details of the researchers. For concerns and complaints, a link to the university's Office of Research Ethics and Integrity was also provided. The information sheet further clarified that participants could withdraw at any time.

Considering the risks with respect to the COVID-19 pandemic and the required precautionary measures, the focus group was conducted via a widely-used teleconferencing software program, in November 2020, lasting approximately two hours. The audio-recorded session was led by a facilitator who steered the discussion through open-ended questions covering teacher perspectives on student understanding, performance, confidence levels, and cognitive load as well as the quality and impact of pre-lab resources. The teaching staff were also asked for suggestions and recommendations to improve pre-lab learning resources further.

Data coding and analysis

Both qualitative and quantitative data were de-identified and anonymised at the coding stage through the application of pseudonyms such as Student A, Student B, Staff A, Staff B, etc. The pseudonyms were assigned in a random order to ensure that identification of participants (both students and staff) could not occur in any pre-determined manner. Three researchers from the team coded the data to reduce bias and validate emerging themes (Eisenhardt Citation1989), with an acceptable level of inter-coder agreement of around 80% (Lombard, Snyder-Duch, and Bracken Citation2002). As different researchers were familiar with the course unit and thermodynamics at different levels, a comprehensive corroboration process was undertaken to resolve disagreements through multiple discussions among the whole research team. A thematic analysis (Braun and Clarke Citation2006) was employed to find emerging patterns of meaning in the dataset (interview/focus group transcripts), where similarities and themes were identified. This thematic analysis was conducted in an inductive manner without a pre-defined list of codes, allowing the data to determine subsequent codes. Qualitative data were fed into NVivo (QSR-International Citation2020), a qualitative data analysis software program, wherein codes and nodes were created, themes were derived, and codes were collated. Codes were grouped and arranged into themes, which were then reviewed and revised; codes and quotes were rearranged, altered, removed, or merged as needed. A corresponding example visual map can be found in Figure 2 under the Results and Discussion section. For the benefit of interested readers, a couple more example figures of visual maps, charts, and node networks have been added in Appendix 1 of this article. Once these steps were completed and the coded data exported to a document, the narrative was written, involving summarising and interpretation of the data.
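The inter-coder agreement figure of around 80% can be illustrated with a simple percent-agreement calculation, one of the indices reviewed by Lombard, Snyder-Duch, and Bracken (2002). The theme labels and codings below are hypothetical illustrations, not the study's actual coding data; chance-corrected indices such as Cohen's kappa or Krippendorff's alpha would be stricter alternatives.

```python
# A minimal sketch of simple percent agreement between two coders.
# The theme labels and item codings are hypothetical, chosen only to
# illustrate the calculation, not taken from the study's data.
coder_a = ["preparedness", "confidence", "quality", "engagement", "confidence",
           "preparedness", "quality", "engagement", "confidence", "quality"]
coder_b = ["preparedness", "confidence", "quality", "confidence", "confidence",
           "preparedness", "quality", "engagement", "engagement", "quality"]

def percent_agreement(a, b):
    """Share of items both coders assigned the same code, as a percentage."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a) * 100

print(percent_agreement(coder_a, coder_b))   # 80.0
```

Here the two coders agree on 8 of 10 items, giving 80% agreement; disagreements would then go through a corroboration process like the one described above.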

Figure 2. A visual map of information derived via thematic coding of focus group data.


Results and discussion

Preparedness and confidence levels of students after using pre-lab resources, based on their own perspectives (RQ1)

It was observed that, by taking student focus away from basic doubts and lack of clarity about laboratory experiments, pre-lab resources helped students develop an understanding of threshold thermodynamics concepts (Khawaja et al. Citation2013), with nearly 90% of student respondents indicating the efficacy of the provided materials in enhancing their understanding. However, the student survey shows that, despite overall satisfaction with the efficacy of the resources in preparing students for the laboratory, less than half of the respondents confirmed having full clarity when asked about the clarity of the resources. 39.5% of student respondents confirmed the clarity and understandability of the resources, while the majority of the remaining respondents showed some scepticism through both quantitative and qualitative answers containing words such as 'uncertain' or 'somewhat'. Assessment of each individual pre-lab material elucidated potential reasons further. For instance, only 28.9% of student-survey respondents confirmed the efficacy of pre-lab handouts in reinforcing student understanding of threshold thermodynamics concepts, while 60.5% showed some scepticism about their efficacy, and a further 7.9% denied any efficacy. Lal et al. (Citation2020) discussed comparable results by pointing out that the quality and depth of pre-lab handouts can influence how students perceive laboratory classes. This indicates a strong association between student preparedness and understanding, which depends on the pre-lab materials provided, along with their quality and depth. Similarly, only 31.6% of students perceived the pre-lab resources to facilitate their understanding of laboratory procedures and experiments. These results point to a further area of research that needs to be addressed.
For example, student survey results could be coupled with Concept Inventory Testing (Midkiff, Litzinger, and Evans Citation2001; Martín-Blas, Seidel, and Serrano-Fernández Citation2010) to strengthen the validity and richness of the analysis. Note that in the main text of the manuscript we have opted to describe and interpret these data in words, which allows for better representation and comprehension of some seemingly complicated results. For interested readers, the corresponding survey questions and summarised results are provided in Appendix 2.
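For illustration, percentage breakdowns of this kind are straightforward to derive from raw categorical survey responses. The tally below is hypothetical: the counts are reverse-engineered to reproduce the reported handout percentages under an assumed 38 respondents, and the answer labels are our own wording, not the survey's.

```python
from collections import Counter

# Hypothetical response tally for the pre-lab handout efficacy question.
# Counts are assumptions chosen to reproduce the reported percentages
# (28.9% / 60.5% / 7.9%) under an assumed n of 38 respondents.
responses = (["confirmed"] * 11 + ["sceptical"] * 23
             + ["denied"] * 3 + ["no answer"] * 1)

n = len(responses)
breakdown = {answer: round(100 * count / n, 1)
             for answer, count in Counter(responses).items()}

print(n)          # 38
print(breakdown)  # {'confirmed': 28.9, 'sceptical': 60.5, 'denied': 7.9, 'no answer': 2.6}
```

Such a tally also shows why the three reported categories sum to 97.3% rather than 100%: a small unaccounted share (modelled here as a single 'no answer') plus rounding can absorb the difference, though this remains an assumption on our part.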

Previous studies on pre-lab online learning emphasise that effective pre-lab resources help students exhibit their acquired subject knowledge with better clarity via application of that knowledge in actual lab classes (Winberg and Berg Citation2007; Haagsman et al. Citation2021). In addition, they discuss how these resources help students reflect on more important theoretical concepts during class, rather than on trivial doubts. Meanwhile, poor clarity or a lack of complete information could impact student success and add to their cognitive load (van Raalte and Boulay Citation2013; Tran et al. Citation2019). Despite a need for increased clarity, the results of this study suggest that pre-lab preparation has allowed students to gain content knowledge, as well as a better understanding of laboratory tools and a sense of 'navigation' through practicals:

I think giving students these resources is very helpful as it allows more time to understand what is happening, and means that more relevant questions can be asked in the physical lab classes. Additionally, these resources being constantly available were very helpful in the report writing and analysis process. (Student G)

This signals the utility of pre-lab resources in effectively managing student cognitive load through efficient preparation and an increased degree of pre-established understanding, indicating influences on extraneous and germane loads. In the context of CLT, this implies an enhanced ability to form schemas in long-term memory while reducing the load on short-term (working) memory; for example, structuring and refining knowledge by ‘asking more relevant questions’, as per the student comment above. This demonstrates how the deployed pre-lab resources positively affect extraneous and germane loads while effectively managing any impact on intrinsic load, leading to a desirable impact on the overall cognitive load of students. Concurring with these observations, similar studies reveal that students who engage in pre-lab preparation have a better understanding of the subject matter and greater awareness of experimental procedures compared to those who do not have access to pre-lab preparatory materials (Haagsman et al. Citation2021). It has also been shown that such students are able to make better sense of the learning outcomes and laboratory tasks (Cann Citation2016).
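The CLT framing above can be summarised using the standard additive formulation found in the cognitive load literature (after Sweller and colleagues); this is a textbook model, not a quantity measured in this study, and recent CLT work sometimes folds germane load into intrinsic load:

```latex
% Standard additive model of cognitive load (CLT):
% total load must stay within the learner's working-memory capacity C.
CL_{\text{total}} \;=\; CL_{\text{intrinsic}} \;+\; CL_{\text{extraneous}} \;+\; CL_{\text{germane}} \;\le\; C
```

Under this model, pre-lab resources aim to lower the extraneous term (fewer basic doubts and procedural confusions in class) so that the capacity freed within $C$ can be devoted to the germane term (schema formation), while the intrinsic term, fixed by the complexity of the thermodynamics content itself, remains essentially unchanged.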

In addition, students and staff associated preparedness with confidence, wherein it was observed that students were not only capable of following what was being taught in class, but were also able to engage in classroom learning confidently and progress faster than they would have without the resources:

Looking at the practical handouts and videos were enough to understand the practical. I had work and so was unable to attend the practical, [yet] I still had a confidence level of understanding the practical content to complete the corresponding assessment. These were very helpful and I found myself watching/reading these resources numerous times. (Student B)

It is worth noting how this student comment relates pre-lab resources to increased levels of confidence, which is a proxy used to measure cognitive load in this study. Given that this student reported being able to complete the assessment without attending the lab class, one might read this as suggesting that online pre-lab resources can replace meaningful in-class lab work. It is worth pointing out that, across the wide range of learners and learning capabilities in a class of 218 students, there are likely to be a few exceptional students who can connect most of the dots from informative pre-lab resources and complete a report without attending the lab. However, pre-lab resources cannot, and should not, replace the meaningful interactions taking place in class, and this was neither the objective nor the eventual overall outcome of the deployed pre-lab online resources. Indeed, both the ANSYS Computer Lab and the Heat Exchanger Lab were very well attended during the study period, as per the anecdotal evidence of teaching staff, and the analysis in the Heat Exchanger Lab Report required measurements from the actual in-person experiment.

Comparable research underlines the potential of pre-lab online resources in honing student skills and experience by nurturing self-confidence (Seery et al. Citation2017). For instance, studies that utilised pre-lab videos suggest that students showed increased levels of confidence during laboratory experiments after watching videos detailing or simulating the experiments (Rodgers et al. Citation2020; Fitriani, Paristiowati, and Mukarromatunnisa Citation2019). By building student confidence, pre-lab resources generate motivation (Barrie et al. Citation2015), further enhancing overall learning experiences (Box et al. Citation2017; Chaytor, Al Mughalaq, and Butler Citation2017; Jolley et al. Citation2016). Furthermore, students perceived pre-lab learning resources as tools that can enhance their learning, confirming their efficacy.

When questioned about the helpfulness of the self-assessment component of the pre-lab materials (i.e. e-quizzes) in consolidating their learning and effectively preparing them for practicals, nearly 95% of student-survey respondents reported finding them either ‘somewhat helpful’ or ‘very helpful’. Scholars agree that pre-lab learning resources allow for improvements in overall student learning, assimilation, and success (Deane, Wisner, and Byram Citation2021; Haagsman et al. Citation2021), leading such students to outperform those who do not have access to pre-lab materials (Basey et al. Citation2014). Thus, student preparedness for laboratory classes creates the opportunity for higher-order cognitive learning, whereby students are able to focus on processing, assimilating, and consolidating acquired knowledge through application, analysis, and synthesis, which is an indication of more effective management of cognitive capacity (Basey et al. Citation2014).

Preparedness and confidence levels of students after using pre-lab resources, based on teaching-staff perspectives (RQ2)

The role of pre-lab online materials in student preparedness was evidenced during the focus group session with teaching staff, who had all been involved with the course unit for at least one previous iteration, and most for longer (two staff for four years; four staff for three years; two staff for two years). Compared with the previous teaching term, which did not offer pre-lab resources, respondents highlighted the positive impact the resources had on student readiness for class. Most agreed that the pre-lab materials, which included not only theoretical content but also experimental data, triggered student curiosity and allowed students to form their own hypotheses ahead of lab classes. It was also observed that the resources decreased teacher-centred classroom time and increased self-reflection and learning, thereby promoting a more effective ‘flipped learning’ environment.

Due to pre-lab preparedness, students appeared to have fewer doubts and better clarity during practicals, leading to higher levels of confidence. Through the lens of CLT, this proxy-based observation indicates a lightening of student cognitive load, paving the way for enhanced learning during class. This allows for a focus on engineering analysis and final outcomes, which in this case is a laboratory report:

Although there weren’t as many questions this year to [be brought to] the computer lab … . because students were following along, I feel like they definitely understood the material that was being taught as, at the end of the class, I think it was reflected by how many students were asking … I guess higher order or more or less general questions at the end of the computer lab … They were asking questions about the lab report as opposed to how to actually conduct the simulations. (Teaching staff F)

When I was a student last year, I found that the flow of the class was impacted by just a lot of students asking questions or being confused about general questions regarding the simulations that could have easily been answered through the use of pre-lab materials or additional information prior to the laboratory. I found a lot of students weren’t able to follow along as a simulation was being delivered by the computer lab instructor. (Teaching staff C)

These teaching-staff comments imply that students were given the opportunity to ask more relevant and deeper questions during lab classes. According to CLT, this is more supportive of developing schemas in long-term memory, reinforced by the effective management of cognitive load through pre-lab resources. Increased levels of student understanding, confidence, and preparedness were also evident in staff perspectives. For instance,

It seems like students had a better understanding of (…) from the pre-lab videos at least, (…) the apparatuses [i.e., equipment and instrumentation] being used for the measurements of the experiments as well as the general process and the thermodynamics understanding of what was being delivered in the experiment. (Teaching staff D)

I feel like overall; they were definitely more confident. I could see quite a few students jumping ahead … going through the process faster than I was delivering it in the classroom. They definitely seem to be more confident in how to actually conduct the simulations, which tells me that they actually either looked through the video … the pre-lab video or they looked through the pre-lab PDF resources – so the step-by-step instructions provided in the pre-lab material. (Teaching staff H)

Regarding the handouts, the general idea was that they were definitely useful for the students to get an idea regarding the lab. And, regarding the quizzes, almost all the students had done the quizzes before they came to the class and that seemed to be helping the students to a reasonable level. (Teaching staff G)

These teaching-staff comments indicate the influence of pre-lab resources on student preparedness and confidence levels, the proxies used to measure cognitive load in this study. These perspectives can be further generalised and validated through visual mapping of information derived via thematic coding of the focus-group data, as seen in (for interested readers, additional figures have been provided in Appendix 1: Thematic coding and visualisation of research data).

These insights reconfirm some of the previously derived insights regarding the correlation between student cognitive load and pre-lab learning resources. In addition, this evidence directly implies positive impacts of pre-lab resources on student participation, learning, and understanding.

It should be noted, in parallel, that multiple moderating variables can affect the overall perceived cognitive load of students (e.g. the impact of COVID-19 in this study), in addition to the utilised proxies. As such, when considering the key insights derived through this study, the confounding influences of such moderating variables should be noted simultaneously. Overall, however, the student and staff perspectives from this investigation combine to coherently elaborate the relationship between student preparedness and confidence levels (the proxies) and cognitive load. In future research, combining these proxies with the aforementioned subjective rating scales such as the CLS, NRS, and NASA TLX (or modified versions) could lead to more insightful, validated, and stronger research findings.

How pre-lab learning appears to manage student cognitive load based on the utilised proxy measurements (RQ3)

As evident from the student and staff perspectives discussed so far, there is a range of opinions on pre-lab resources and their influence on managing student cognitive load. The general consensus among most teaching staff was that pre-lab online resources had a significant impact on student laboratory performance, as they contributed towards effectively managing the cognitive load that students experience during laboratory classes. Teaching staff highlighted a positive change in student attendance, alertness, and laboratory learning:

They would just have to sort of think about and prepare for the lab, so they understood the lab more thoroughly when they actually got there and then they were able to interpret the results and do their associated analysis. (Teaching staff K)

Another point is, they already know what they have to do in their lab report so they are more concentrated on what they have to do [in class]. (Teaching staff C)

These comments demonstrate how pre-lab resources improved student preparedness for lab classes. While literature linking pre-lab learning, cognitive load, and student attendance is scarce, lessons can be drawn from a pedagogical approach that has become increasingly popular: the ‘flipped classroom’. In this approach, activities traditionally conducted within the classroom become pre-classroom activities, freeing up classroom time for more effective and meaningful engagement (Akçayır and Akçayır Citation2018) and creating more possibilities for schema formation. The flipped classroom approach is inherently more learner-centred and can lead to more active and dynamic learning environments (Sohrabi and Iraj Citation2016). On this basis, the pre-lab-online-resources approach in the current investigation aligns strongly with the flipped classroom by encouraging thermodynamics students to take part in learning activities prior to attending actual classes. There is also evidence supporting the use of flipped classroom approaches to effectively manage cognitive load (Abeysekera and Dawson Citation2015), and a number of earlier studies have demonstrated that flipped learning has a significant impact on class attendance and performance (Tune, Sturek, and Basile Citation2013; O'Flaherty and Phillips Citation2015; Smallhorn Citation2017). A recent study showcased the impact of a flipped classroom approach in which attendance increased by 95.4%, with 73.3% of students attending all lectures (Goedhart et al. Citation2019). These findings corroborate the views of teaching staff from the present investigation with regard to student attendance after online pre-lab preparation.

A research study by Habib (Citation2020) revealed that pre-lab resources immensely helped students, leading them to display high levels of concentration during laboratory activities and increased engagement with the lab instructor. The author further noted that students were more alert, more receptive to information, and had developed an increased capacity to answer questions asked by the instructor during lab time. This suggests that pre-lab learning has demonstrated potential in managing or reducing cognitive load and increasing student performance in class. It should be noted that a few respondents (both students and staff) from the present study perceived the resources to make no change to the total cognitive load, since a significant part of it had simply moved to the pre-lab learning phase instead of the in-class laboratory phase. Others, however, observed that pre-lab learning helped student preparation either by reducing cognitive load or by distributing learning more evenly throughout the study period. By effectively managing student cognitive load, pre-lab resources were noted to provide a better learning experience:

To me, I’d say the overall cognitive load was either the same or less and mostly I think it was even less and one reason I have for that is that the number of enquiries by students about the lab, the meaning of the results, how to prepare the report and so on. (Teaching staff J)

I think it actually distributed the workload the students carried because, usually the students wait until the last moment or Week 12 or 13, or whatever the assignment deadline is to do these things. By introducing the pre-lab resources and the quizzes, we actually tried to bring that forward to either Week 4 or 6 so that students can be familiar with the content that they actually have to go through for their assessment much earlier. So, I think in terms of that … It actually kind of made the workload more even. (Teaching staff E)

It should be noted that students were provided with a sufficient amount of time to peruse the pre-lab resources prior to attending lab classes. Alongside the distribution of student cognitive load, the pre-lab materials also served as an effective tool in reducing the pressure and stress that students may have felt during lab classes had they been unaware of the experiments or simulations taking place:

It actually doesn’t add to the overall cognitive load. I think it somewhat distributed it [cognitive load]. (…) If you talk about the cognitive load and subsequent stress that students go through during the assessment or the exam period, I think the pre-lab resources (…) help to even it out and further distribute and gave students time to go through it with less stress. (Teaching staff G)

So, for the pre-lab videos and the quizzes, we’re like … it is beneficial for the students that they will have an idea about what they will actually be doing in the lab practicals, rather than just coming empty-minded in the lab. (Teaching staff E)

Early theorists Piaget (Citation1952), Novak (Citation1993) and Novak (Citation2010) identified cognitive learning and development as central to meaningful and effective student learning. Post-millennium studies show that students’ attitudes towards laboratory classes are more positive and indicative of enhanced motivation when supported by preparatory materials such as simulations (Josephsen and Kristensen Citation2006; Bortnik et al. Citation2017), further reducing cognitive load and any stress relating to work in the practical laboratories (Lamichhane and Maltese Citation2019).

Students viewed pre-lab online learning resources, and their relationship with cognitive load, in a slightly different manner. In terms of time consumption, 75.6% of students indicated they spent between 1 and 2 h in preparation, and 79% of students found the time given sufficient to cover the pre-lab materials. These results, drawn from the student survey, are illustrated in Figures 3 and 4. In comparison, it is worthwhile considering the time allocated to prepare for the same lab classes during previous iterations of the course (i.e. before pre-lab online resources were introduced), when around 15–20 min were allocated at the beginning of each lab class to go through the corresponding lab-instruction manual (similar to the pre-lab handout). According to anecdotal evidence from teaching staff, although this resource had been provided to students well ahead of time via the LMS, the vast majority of students perused the document for the first time during this 15–20-minute period, in contrast to the 1–2 h spent pre-lab by the majority of students after the introduction of the pre-lab online resources.

Figure 3. Student preparation time using pre-lab resources.


Figure 4. Student perception of pre-lab preparation time allocation.


It is worth considering further how this information connects to CLT. For instance, it is important to ensure that students were neither overwhelmed nor confused by the pre-lab resources, as this can have an adverse impact on successfully achieving learning outcomes (Bruen and Kelly Citation2017; Moos and Pitton Citation2014). To explore this further, the following information drawn from the student survey regarding student interactions with the pre-lab resources is helpful.

Tables 6 and 7 illustrate pre-lab-video views by students, with each video viewed, on average, at least twice:

Table 6. Student viewership for the Heat Exchanger pre-lab video.

Out of 218 students, the vast majority had completed watching the ANSYS Simulation (Computer Lab) and Heat Exchanger Experiment videos, with 85–90% viewership for all pre-lab videos.

Further, the student satisfaction ratings for the various pre-lab resources, shown in Table 8, tell a positive story about student experience with the pre-lab resources.

Table 8. Student satisfaction ratings for different pre-lab materials.

While the individual ratings for pre-lab handouts and videos were 4.43/5.00 and 4.27/5.00, respectively, the overall satisfaction rating of 4.35 out of 5.00 stands as a significant indicator of positive student interaction with the introduced pre-lab learning resources.

This information, coupled with the fact that almost 80% of students indicated they had sufficient time to go through the pre-lab resources, implies that students were not generally overwhelmed or confused by the pre-lab resources, demonstrating a healthy impact on their overall cognitive load. It also implies that the probability of students who viewed the Heat Exchanger and ANSYS Simulation pre-lab videos three times (as per Tables 6 and 7) being confused or overwhelmed is very low. The combination of these facts validates the initially developed hypothesis that pre-lab resources can have an overall positive impact by effectively managing extraneous and germane loads, while strongly outweighing any undesirable impact on intrinsic load.

Table 7. Student viewership for the ANSYS Simulation video.

According to CLT, human working memory has a limited ability to process new information, with capacity varying across age categories such as children, the elderly, and young adults (Lamichhane and Maltese Citation2019; Paas and Ayres Citation2014). For this reason, it becomes crucial for educators to consider ways to effectively manage student cognitive load across the overall learning and teaching process (e.g. throughout a semester). One such way is to consider the length and complexity of resources, as well as the time required to complete studying or preparation, which can help manage student cognitive load without overloading. The evidence presented above resonates well with this premise. Furthermore, the Yerkes-Dodson law states that peak performance is reached at moderate levels of stress or arousal, while too little or too much stress/arousal leads to poorer performance (Yerkes and Dodson Citation1908; Teigen Citation1994). It is intended that distributing new information and instructions for lab classes through pre-lab resources would lead to moderate stress levels, helping students reach peak-level performance in the context of this course unit.
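The Yerkes-Dodson relationship has no single canonical equation; it is conventionally sketched as an inverted-U. One purely illustrative form, in which the parameters are hypothetical and not quantities estimated in this study, is:

```latex
% Illustrative inverted-U sketch of the Yerkes-Dodson relationship.
% P_max, k and a* are hypothetical parameters, not measured values.
P(a) \;=\; P_{\max} - k\,(a - a^{*})^{2}, \qquad k > 0
```

Here performance $P$ peaks at a moderate arousal level $a = a^{*}$ and declines symmetrically on either side, which is the qualitative behaviour the pre-lab resources aim to exploit by keeping student stress near the moderate range.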

Although CLT is useful in identifying strategies and mechanisms to improve cognitive learning outcomes, there are limited prescriptions as to how it links with the emotional and motivational facets of learning (Feldon et al. Citation2019; Ginns and Leppink Citation2019). Yet, multiple research works concur that cognitive load has a direct impact on motivational beliefs, regardless of student performance in class (Likourezos and Kalyuga Citation2017; Huang Citation2017; Feldon et al. Citation2018). While the handouts and videos received a positive response from both staff and students, the same could not be said of the e-quizzes. In particular, a number of students found the e-quizzes to be a burden rather than a learning-consolidation exercise, implying a certain increase in cognitive load:

I didn't find any value in them [i.e., pre-lab e-quizzes]. They didn't offer much in terms of checking my understanding of concepts and were instead just an added thing to worry about. However, if more were added, it would be helpful to explicitly state that these could be used as ‘check lists’ for ensuring we have understanding in the theory concepts. (Student A)

Actually, a lot of students either found them difficult or quite challenging, so there was a wide variety in our students perceiving [unintelligible] the questions. In regard to having to actually do the pre-lab quiz questions, a lot of the students … , the consensus was that the students did find it a little tedious. (Teaching staff B)

The reason for the above sentiments may be that the e-quizzes formed part of this course unit’s assessment, albeit a small percentage. Paradoxically, the e-quizzes were one of the key drivers that led students to engage with the pre-lab materials, significantly increasing participation in the overall pre-lab resource-assessment framework; without them, a participation or completion rate as high as 94% would have been difficult to imagine. Coupling pre-lab resources with Concept Inventory Testing (Midkiff, Litzinger, and Evans Citation2001) might further strengthen the overall investigation, which can be considered a potential avenue for future work.

A study by Cann (Citation2016) indicates a unique behavioural pattern in students undertaking laboratory e-quizzes: irrespective of whether pre-lab e-quizzes are formally assessed, student behaviour was driven by the number of questions they answered correctly. Most students seemed to retake e-quizzes to achieve ‘a higher (preferably perfect) score’, showcasing a certain degree of gamification. Despite student comments on time consumption, difficulty levels, and other concerns, the general consensus on the pre-lab e-quizzes remained positive in terms of their contribution to learning consolidation.

Corroborating Cann’s (Citation2016) observations, some teaching staff from the current study found the e-quizzes to be among the most valuable, engaging, and learner-friendly resources alongside the videos. They highlighted the completion rate of the e-quizzes and their potential to motivate students to engage with the other pre-lab materials, leading to increased student preparedness.

From my experience in the classrooms, I think only maybe two or three students were unable to finish the quiz in time, so that’s three students out of about 200, so it’s definitely reasonable. (Teaching staff D)

The quizzes serve as a means for the students [as] a motivational tool for students to actually watch the videos and look at the pre-lab handouts for the experiments and the computer labs prior to attending them. (Teaching staff I)

It is noteworthy that these comments provide a different, and more positive, perspective on the pre-lab e-quizzes compared to the student and staff comments discussed immediately above, which indicated a certain increase in workload and cognitive load. This demonstrates the internal variability of student and staff perceptions on this matter, even within the same cohort, which can lead to a richer understanding of the overall phenomenon. More holistically, the combination of these perceptions points to a relatively larger increase in learning participation achieved via a relatively smaller increase in student workload (and cognitive load); in more generic engineering terminology, a ‘larger output’ through a ‘smaller input’, directly implying an increase in overall learning efficiency. This exhibits more effective management of student cognitive load, which is the key objective of this research investigation.

Furthermore, the need to continue with the e-quizzes was explicitly conveyed, due to their potential to improve student comprehension, learning of threshold thermodynamics concepts, and motivation, while discouraging procrastination.

Quizzes and marked tutorials have proven very effective for me, they help to encourage learning for lazy uni[versity] students. (Student L)

Regarding the e-quizzes (…) they were just there really to [encourage] all the students to do the pre-lab material, because we can’t just make it compulsory. We have no way to enforce students to watch it but if they are basically given some motivation of getting some assessment, the most motivated students will watch the videos regardless of assessment. The less motivated students may not watch them and therefore not have a very good experience in the labs, though I think [unintelligible] we gave a small mark but it was enough marks to motivate students. And, I think students did actually learn something by doing the questions. (Teaching staff C)

Similar to the present study, previous studies combining pre-laboratory videos and online quizzes have shown that these can decrease student anxiety about upcoming lab classes and effectively manage the cognitive load required for students to efficiently learn and retain new concepts (Shelby and Fralish Citation2021; Jolley et al. Citation2016; Stieff et al. Citation2018). For these reasons, higher education institutions have increasingly promoted such active-learning methodologies to encourage students to gain a thorough understanding of threshold concepts, develop critical-thinking skills, and thereby effectively manage their cognitive load (Cook and Babon Citation2017; Khawaja et al. Citation2013).

Conclusions and recommendations

This research provides new insights into how pre-lab online learning resources can be utilised to effectively manage the cognitive load of engineering students. It can be concluded that pre-lab online resources, in the context of thermodynamics education, lead to increased student preparedness and confidence levels, as validated through the perspectives of both students and teaching staff.

The key learnings from this research investigation can be summarised as follows:

  1. Students perceive themselves to be more prepared and confident after using pre-lab learning resources (RQ1-related).

  2. Teaching staff also perceive students to be more prepared and confident after using pre-lab learning resources (RQ2-related).

  3. Based on proxy measurements of student preparedness and confidence levels, pre-lab resources show potential for having a positive effect on managing cognitive load of engineering students (RQ3-related).

These findings answer the research questions established earlier in this article. This is the first research study, in the context of thermodynamics education, to explore how pre-lab online resources can be utilised to effectively manage student cognitive load. Another contribution of this investigation is exploring how pre-lab learning resources can influence the combined domains of student preparedness, confidence levels, and cognitive load in the context of thermodynamics education for engineering students. As pointed out in previous sections, the confounding effects of other moderating variables (in addition to the utilised proxies of student preparedness and confidence levels) on cognitive load should be noted simultaneously. In future research, combining these proxies with subjective rating scales such as the CLS, NRS, and NASA TLX could lead to further refined insights about the true capability of pre-lab resources to effectively manage student cognitive load.

There are certainly further challenges to be addressed through future work, as previously highlighted in this article. This is an objective of the ongoing longitudinal study, which can allow for further understanding of the conjunction between pre-lab learning, student preparedness, confidence levels, and the management of cognitive load. Through continued data collection via student attitudinal surveys, staff focus groups, and archival records, this longitudinal investigation can yield broader insights into the influence of pre-lab online learning in engineering education. It may even be possible to map student understanding of threshold theoretical and technical concepts against the progression of a course unit within a given semester. Comparative study of pre-COVID, during-COVID, and post-COVID student performance, with and without online pre-lab materials, could generate knowledge of unique value on the effectiveness of blended learning under disrupted learning and teaching conditions. This would help in understanding the influence of pre-lab learning resources on student performance and the assessment frameworks deployed. Further, it is hoped that a longitudinal study would reveal how pre-lab preparatory materials impact student cognitive load not only for thermodynamics students, but also for students of other advanced engineering course units such as fluid mechanics, given the analogous analytical nature of the subjects.

Moreover, recommendations for future studies on a broader scale include consideration of more innovative methods and tools to enhance students’ pre-lab learning experience; coupling student-survey results and pre-lab e-quizzes with concept inventory testing; qualitatively evaluating deeper student-and-teaching-staff perspectives on cognitive load; and consideration of how different cognitive domains recognised within Bloom’s Taxonomy relate to pre-lab learning.

Acknowledgements

The authors express their gratitude to all research participants for their time and contributions.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Correction Statement

This article has been corrected with minor changes. These changes do not impact the academic content of the article.

Additional information

Funding

The authors would like to acknowledge the funding received from the Queensland University of Technology (School of Mechanical, Medical and Process Engineering) and University of the Sunshine Coast (School of Science, Technology and Engineering).

Notes on contributors

C. M. Rathnayaka

Charith Rathnayaka is a Lecturer in the School of Science, Technology and Engineering at the University of the Sunshine Coast (USC) and Senior Fellow of the Higher Education Academy (SFHEA). He is also an Adjunct Lecturer at the Queensland University of Technology (QUT). Charith conducts his research in the area of Computational Mechanics. His work specifically focuses on better understanding the mechanical response of soft matter under different conditions as caused by various natural and man-made circumstances. Charith conducts further research in the area of Engineering Education focusing on how to create innovative learning and teaching experiences through peer programs and digital learning.

J. Ganapathi

Janani Ganapathi is a QMomentum Fellow at Queensland University of Technology’s (QUT) Graduate Research Center, involved in transdisciplinary projects providing qualitative research expertise. Her research interests include open and distance education, STEM education, learning and teaching pedagogies and sustainability in education. She is also a Lecturer-Moderator of QUT’s eGrad School, within QUT’s Graduate Research Education & Development (GRE+D), wherein she manages and teaches higher degree research modules.

S. Kickbusch

Steven Kickbusch is a Learning Designer with QUT, where he oversees the design, development and implementation of engaging student-focused education experiences. Additionally, he is engaged as a sessional academic in the Creative Industries Faculty and the Faculty of Engineering at QUT. His current research is investigating the role of design to support the development of teachers in the STEM disciplines and how designers collaborate with educators to improve student experience.

L. Dawes

Les Dawes is Head of School of Civil and Environmental Engineering and Professor of Engineering Education at Queensland University of Technology in Brisbane. He has over 150 publications with many in STEM and engineering education. Dissemination of his research has translated into practical outcomes with a strong uptake by schools, industry collaboration and inclusion into ongoing curricula.

R. Brown

Richard Brown’s disciplines of expertise are thermodynamics and fluid mechanics. His research interests include applied thermodynamics, internal combustion engines, renewable/alternative energy, smog formation, emissions/dispersion and environmental fluid mechanics. Richard is the Director of the Biofuel Engine Research Facility at QUT and leader of the Environmental Fluid Mechanics Group.

References

  • Abeysekera, Lakmal, and Phillip Dawson. 2015. “Motivation and Cognitive Load in the Flipped Classroom: Definition, Rationale and a Call for Research.” Higher Education Research & Development 34 (1): 1–14. doi:10.1080/07294360.2014.934336.
  • Ahern, Aoife, Caroline Dominguez, Ciaran McNally, John J. O’Sullivan, and Daniela Pedrosa. 2019. “A Literature Review of Critical Thinking in Engineering Education.” Studies in Higher Education 44 (5): 816–828. doi:10.1080/03075079.2019.1586325.
  • Akçayır, Gökçe, and Murat Akçayır. 2018. “The Flipped Classroom: A Review of Its Advantages and Challenges.” Computers & Education 126: 334–345. doi:10.1016/j.compedu.2018.07.021.
  • Akkaraju, Shylaja. 2016. “The Role of Flipped Learning in Managing the Cognitive Load of a Threshold Concept in Physiology.” Journal of Effective Teaching 16 (3): 28–43. https://eric.ed.gov/?id=EJ1125897.
  • ANSYS. 2020. “Ansys® Fluent, Release 19.0”.
  • Ashaver, Doosuur, and Sandra Mwuese Igyuve. 2013. “The Use of Audio-Visual Materials in the Teaching and Learning Processes in Colleges of Education in Benue State-Nigeria.” IOSR Journal of Research & Method in Education 1 (6): 44–55. http://www.iosrjournals.org/iosr-jrme/papers/Vol-1%20Issue-6/G0164455.pdf.
  • Baker, Tony. 2005. “Chemistry: Laboratory Science or Not?” Chemistry in Australia 72 (3): 12–13. doi:10.2555/0314-4240.72.3.1641.
  • Barrie, Simon C., Robert B. Bucat, Mark A. Buntine, Karen Burke da Silva, Geoffrey T. Crisp, Adrian V. George, Ian M. Jamie, et al. 2015. “Development, Evaluation and Use of a Student Experience Survey in Undergraduate Science Laboratories: The Advancing Science by Enhancing Learning in the Laboratory Student Laboratory Learning Experience Survey.” International Journal of Science Education 37 (11): 1795–1814. doi:10.1080/09500693.2015.1052585.
  • Basey, John M., Anastasia P. Maines, Clinton D. Francis, Brett Melbourne, Sarah B. Wise, Rebecca J. Safran, and Pieter T. J. Johnson. 2014. “Impact of Pre-Lab Learning Activities, a Post-Lab Written Report, and Content Reduction on Evolution-Based Learning in an Undergraduate Plant Biodiversity Lab.” Evolution: Education and Outreach 7 (1): 10. doi:10.1186/s12052-014-0010-7.
  • Blackboard. 2020. “QUT Blackboard”.
  • Bortnik, Boris, Natalia Stozhko, Irina Pervukhina, Albina Tchernysheva, and Galina Belysheva. 2017. “Effect of Virtual Analytical Chemistry Laboratory on Enhancing Student Research Skills and Practices.” Research in Learning Technology 25. doi:10.25304/rlt.v25.1968.
  • Box, Melinda C., Cathi L. Dunnagan, Lauren A. S. Hirsh, Clinton R. Cherry, Kayla A. Christianson, Radiance J. Gibson, Michael I. Wolfe, and Maria T. Gallardo-Williams. 2017. “Qualitative and Quantitative Evaluation of Three Types of Student-Generated Videos as Instructional Support in Organic Chemistry Laboratories.” Journal of Chemical Education 94 (2): 164–170. doi:10.1021/acs.jchemed.6b00451.
  • Braun, Virginia, and Victoria Clarke. 2006. “Using Thematic Analysis in Psychology.” Qualitative Research in Psychology 3 (2): 77–101. doi:10.1191/1478088706qp063oa.
  • Bruen, Jennifer, and Niamh Kelly. 2017. “Using a Shared L1 to Reduce Cognitive Overload and Anxiety Levels in the L2 Classroom.” The Language Learning Journal 45 (3): 368–381. doi:10.1080/09571736.2014.908405.
  • Buchner, Josef, Katja Buntins, and Michael Kerres. 2021. “A Systematic Map of Research Characteristics in Studies on Augmented Reality and Cognitive Load.” Computers and Education Open 2: 100036. doi:10.1016/j.caeo.2021.100036.
  • Cann, Alan J. 2016. “Increasing Student Engagement with Practical Classes Through Online Pre-Lab Quizzes.” Journal of Biological Education 50 (1): 101–112. doi:10.1080/00219266.2014.986182.
  • Chaytor, Jennifer L., Mohammad Al Mughalaq, and Hailee Butler. 2017. “Development and Use of Online Prelaboratory Activities in Organic Chemistry to Improve Students’ Laboratory Experience.” Journal of Chemical Education 94 (7): 859–866. doi:10.1021/acs.jchemed.6b00850.
  • Cook, Brian Robert, and Andrea Babon. 2017. “Active Learning Through Online Quizzes: Better Learning and Less (Busy) Work.” Journal of Geography in Higher Education 41 (1): 24–38. doi:10.1080/03098265.2016.1185772.
  • Deane, Andrew, Rebecca Wisner, and Jessica Byram. 2021. “Viewing Pre-Lab Gross Anatomy Demonstration Videos Correlates Positively with Student Performance When Total Dissection Time is Limited by Covid-19 Restrictions.” The FASEB Journal 35 (S1). doi:10.1096/fasebj.2021.35.S1.03644.
  • Dray, Barbara J., Patrick R. Lowenthal, Melissa J. Miszkiewicz, Maria Araceli Ruiz-Primo, and Kelly Marczynski. 2011. “Developing an Instrument to Assess Student Readiness for Online Learning: A Validation Study.” Distance Education 32 (1): 29–47. doi:10.1080/01587919.2011.565496.
  • Eisenhardt, Kathleen M. 1989. “Building Theories from Case Study Research.” Academy of Management Review 14 (4): 532–550. doi:10.5465/amr.1989.4308385.
  • Feisel, Lyle D., and Albert J. Rosa. 2005. “The Role of the Laboratory in Undergraduate Engineering Education.” Journal of Engineering Education 94 (1): 121–130. doi:10.1002/j.2168-9830.2005.tb00833.x.
  • Feldon, David F., Gregory Callan, Stephanie Juth, and Soojeong Jeong. 2019. “Cognitive Load as Motivational Cost.” Educational Psychology Review 31 (2): 319–337. doi:10.1007/s10648-019-09464-6.
  • Feldon, David F., Joana Franco, Jie Chao, James Peugh, and Cathy Maahs-Fladung. 2018. “Self-Efficacy Change Associated with a Cognitive Load-Based Intervention in an Undergraduate Biology Course.” Learning and Instruction 56: 64–72. doi:10.1016/j.learninstruc.2018.04.007.
  • Fitriani, E., M. Paristiowati, and B. Mukarromatunnisa. 2019. “Titration Pre-Lab Demonstration Videos in Basic Chemistry Laboratory Activity: Design and Development.” Journal of Physics: Conference Series 1402 (5): 055047. doi:10.1088/1742-6596/1402/5/055047.
  • Gavas, Rahul D., Soumya Ranjan Tripathy, Debatri Chatterjee, and Aniruddha Sinha. 2018. “Cognitive Load and Metacognitive Confidence Extraction from Pupillary Response.” Cognitive Systems Research 52: 325–334. doi:10.1016/j.cogsys.2018.07.021.
  • Gerjets, Peter, Katharina Scheiter, and Gabriele Cierniak. 2009. “The Scientific Value of Cognitive Load Theory: A Research Agenda Based on the Structuralist View of Theories.” Educational Psychology Review 21 (1): 43–54. doi:10.1007/s10648-008-9096-1.
  • Gillie, Martin, Ranim Dahli, Fiona C. Saunders, and Andrew Gibson. 2017. “Use of Rich-Media Resources by Engineering Undergraduates.” European Journal of Engineering Education 42 (6): 1496–1511. doi:10.1080/03043797.2017.1306488.
  • Ginns, Paul, and Jimmie Leppink. 2019. “Special Issue on Cognitive Load Theory: Editorial.” Educational Psychology Review 31 (2): 255–259. doi:10.1007/s10648-019-09474-4.
  • Goedhart, N. S., N. Blignaut-van Westrhenen, C. Moser, and M. B. M. Zweekhorst. 2019. “The Flipped Classroom: Supporting a Diverse Group of Students in Their Learning.” Learning Environments Research 22 (2): 297–310. doi:10.1007/s10984-019-09281-2.
  • Haagsman, Marjolein E., Margot C. Koster, Johannes Boonstra, and Karin Scager. 2021. “Be Prepared! How Pre-Lab Modules Affect Students’ Understanding of Gene Mapping.” Journal of Science Education and Technology 30 (4): 461–470. doi:10.1007/s10956-020-09890-0.
  • Habib, A. 2020. “Pre-Lab Assignment and Discussion to Enhance Students’ Benefit From Lab Sessions.” Paper presented at the 2020 Sixth International Conference on e-Learning (econf), 6–7 December 2020.
  • Hart, Sandra G., and Lowell E. Staveland. 1988. “Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research.” In Advances in Psychology, edited by Peter A. Hancock, and Najmedin Meshkati, 139–183. North-Holland: Elsevier.
  • Huang, Xiaoxia. 2017. “Example-based Learning: Effects of Different Types of Examples on Student Performance, Cognitive Load and Self-Efficacy in a Statistical Learning Task.” Interactive Learning Environments 25 (3): 283–294. doi:10.1080/10494820.2015.1121154.
  • Johnstone, A. H. 1997. “Chemistry Teaching-Science or Alchemy? 1996 Brasted Lecture.” Journal of Chemical Education 74 (3): 262. doi:10.1021/ed074p262.
  • Johnstone, A. H., R. J. Sleet, and J. F. Vianna. 1994. “An Information Processing Model of Learning: Its Application to an Undergraduate Laboratory Course in Chemistry.” Studies in Higher Education 19 (1): 77–87. doi:10.1080/03075079412331382163.
  • Jolley, Dianne F., Stephen R. Wilson, Celine Kelso, Glennys O’Brien, and Claire E. Mason. 2016. “Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes to Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes.” Journal of Chemical Education 93 (11): 1855–1862. doi:10.1021/acs.jchemed.6b00266.
  • Jones, Susan M., and Ashley Edwards. 2010. “Online Pre-Laboratory Exercises Enhance Student Preparedness for First Year Biology Practical Classes.” International Journal of Innovation in Science 18 (2): 1–9. https://openjournals.library.sydney.edu.au/CAL/article/view/4641.
  • Josephsen, Jens, and Agnieszka Kosminska Kristensen. 2006. “Simulation of Laboratory Assignments to Support Students’ Learning of Introductory Inorganic Chemistry.” Chemistry Education Research and Practice 7 (4): 266–279. doi:10.1039/B6RP90013E.
  • Khawaja, M. Asif, Gangadhara B. Prusty, Robin A. J. Ford, Nadine Marcus, and Carol Russell. 2013. “Can More Become Less? Effects of an Intensive Assessment Environment on Students’ Learning Performance.” European Journal of Engineering Education 38 (6): 631–651. doi:10.1080/03043797.2013.834295.
  • Kirschner, Paul A., Paul Ayres, and Paul Chandler. 2011. “Contemporary Cognitive Load Theory Research: The Good, the Bad and the Ugly.” Computers in Human Behavior 27 (1): 99–105. doi:10.1016/j.chb.2010.06.025.
  • Kirschner, Paul A., John Sweller, and Richard E. Clark. 2006. “Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching.” Educational Psychologist 41 (2): 75–86. doi:10.1207/s15326985ep4102_1.
  • Klepsch, Melina, Florian Schmitz, and Tina Seufert. 2017. “Development and Validation of Two Instruments Measuring Intrinsic, Extraneous, and Germane Cognitive Load.” Frontiers in Psychology 8: 1997. doi:10.3389/fpsyg.2017.01997.
  • Lal, Sulakshana, Anthony D. Lucey, Euan D. Lindsay, David F. Treagust, John M. Long, Mauro Mocerino, and Marjan G. Zadnik. 2020. “Student Perceptions of Instruction Sheets in Face-to-Face and Remotely-Operated Engineering Laboratory Learning.” European Journal of Engineering Education 45 (4): 491–515. doi:10.1080/03043797.2019.1654433.
  • Lamichhane, R., and A. Maltese. 2019. “Enhancing Students’ Laboratory Experiences in Undergraduate Chemistry.” Technology Integration in Chemistry Education and Research (TICER), 83–106. American Chemical Society.
  • Leppink, Jimmie, Fred Paas, Cees P. M. van der Vleuten, Tamara van Gog, and Jeroen J. G. van Merriënboer. 2013. “Development of an Instrument for Measuring Different Types of Cognitive Load.” Behavior Research Methods 45: 1058–1072. doi:10.3758/s13428-013-0334-1.
  • Likourezos, Vicki, and Slava Kalyuga. 2017. “Instruction-First and Problem-Solving-First Approaches: Alternative Pathways to Learning Complex Tasks.” Instructional Science 45 (2): 195–219. doi:10.1007/s11251-016-9399-4.
  • Lombard, Matthew, Jennifer Snyder-Duch, and Cheryl Campanella Bracken. 2002. “Content Analysis in Mass Communication: Assessment and Reporting of Intercoder Reliability.” Human Communication Research 28 (4): 587–604. doi:10.1111/j.1468-2958.2002.tb00826.x.
  • Lucas, Bill, Guy Claxton, and Janet Hanson. 2014. Thinking Like an Engineer: Implications for the Education System.
  • Martín-Blas, Teresa, Luis Seidel, and Ana Serrano-Fernández. 2010. “Enhancing Force Concept Inventory Diagnostics to Identify Dominant Misconceptions in First-Year Engineering Physics.” European Journal of Engineering Education 35 (6): 597–606. doi:10.1080/03043797.2010.497552.
  • Microsoft-Corporation. 2020. “Microsoft Forms”.
  • Midkiff, K. Clark, Thomas A. Litzinger, and D. L. Evans. 2001. “Development of Engineering Thermodynamics Concept Inventory Instruments.” Paper presented at the 31st Annual Frontiers in Education Conference: Impact on Engineering and Science Education, Conference Proceedings (Cat. No. 01CH37193).
  • Moos, Daniel C., and Debra Pitton. 2014. “Student Teacher Challenges: Using the Cognitive Load Theory as an Explanatory Lens.” Teaching Education 25 (2): 127–141. doi:10.1080/10476210.2012.754869.
  • Novak, Joseph D. 1993. “Human Constructivism: A Unification of Psychological and Epistemological Phenomena in Meaning Making.” International Journal of Personal Construct Psychology 6 (2): 167–193. doi:10.1080/08936039308404338.
  • Novak, Joseph D. 2010. Learning, Creating, and Using Knowledge: Concept Maps as Facilitative Tools in Schools and Corporations. New York: Routledge.
  • O'Flaherty, Jacqueline, and Craig Phillips. 2015. “The Use of Flipped Classrooms in Higher Education: A Scoping Review.” The Internet and Higher Education 25: 85–95. doi:10.1016/j.iheduc.2015.02.002.
  • Paas, Fred, and Paul Ayres. 2014. “Cognitive Load Theory: A Broader View on the Role of Memory in Learning and Education.” Educational Psychology Review 26 (2): 191–195. doi:10.1007/s10648-014-9263-5.
  • Paas, Fred, Alexander Renkl, and John Sweller. 2003. “Cognitive Load Theory and Instructional Design: Recent Developments.” Educational Psychologist 38 (1): 1–4. doi:10.1207/S15326985EP3801_1.
  • Paas, Fred G. W. C., and Jeroen J. G. Van Merriënboer. 1993. “The Efficiency of Instructional Conditions: An Approach to Combine Mental Effort and Performance Measures.” Human Factors 35 (4): 737–743. doi:10.1177/001872089303500412.
  • Piaget, Jean. 1952. “The Origins of Intelligence in Children.” In The Origins of Intelligence in Children, edited by Margaret Cook, 25–36. New York, NY: W W Norton & Co.
  • QSR-International. 2020. “NVivo”.
  • Rahman, Ataur. 2017. “A Blended Learning Approach to Teach Fluid Mechanics in Engineering.” European Journal of Engineering Education 42 (3): 252–259. doi:10.1080/03043797.2016.1153044.
  • Ramsden, Jeremy. 2012. “The Differences Between Engineering and Science.” Measurement and Control 45 (5): 145–146. doi:10.1177/002029401204500503.
  • Reid, Norman, and Iqbal Shah. 2007. “The Role of Laboratory Work in University Chemistry.” Chemistry Education Research and Practice 8 (2): 172–185. doi:10.1039/B5RP90026C.
  • Rodgers, T. L., N. Cheema, S. Vasanth, A. Jamshed, A. Alfutimie, and P. J. Scully. 2020. “Developing Pre-Laboratory Videos for Enhancing Student Preparedness.” European Journal of Engineering Education 45 (2): 292–304. doi:10.1080/03043797.2019.1593322.
  • Seery, Michael K., Hendra Y. Agustian, Euan D. Doidge, Maciej M. Kucharski, Helen M. O’Connor, and Amy Price. 2017. “Developing Laboratory Skills by Incorporating Peer-Review and Digital Badges.” Chemistry Education Research and Practice 18 (3): 403–419. doi:10.1039/C7RP00003K.
  • Shelby, Shameka J., and Zachary D. Fralish. 2021. “Using Edpuzzle to Improve Student Experience and Performance in the Biochemistry Laboratory.” Biochemistry and Molecular Biology Education 49 (4): 529–534. doi:10.1002/bmb.21494.
  • Smallhorn, Masha. 2017. “The Flipped Classroom: A Learning Model to Increase Student Engagement Not Academic Achievement.” Student Success. doi:10.3316/informit.593366988343831.
  • Sohrabi, Babak, and Hamideh Iraj. 2016. “Implementing Flipped Classroom Using Digital Media: A Comparison of Two Demographically Different Groups Perceptions.” Computers in Human Behavior 60: 514–524. doi:10.1016/j.chb.2016.02.056.
  • Stieff, Mike, Stephanie M. Werner, Bill Fink, and Dianne Meador. 2018. “Online Prelaboratory Videos Improve Student Performance in the General Chemistry Laboratory.” Journal of Chemical Education 95 (8): 1260–1266. doi:10.1021/acs.jchemed.8b00109.
  • Sweller, John, Jeroen J. G. Van Merrienboer, and Fred G. W. C. Paas. 1998. “Cognitive Architecture and Instructional Design.” Educational Psychology Review 10 (3): 251–296. doi:10.1023/a:1022193728205.
  • Teigen, Karl Halvor. 1994. “Yerkes-Dodson: A Law for All Seasons.” Theory & Psychology 4 (4): 525–547. doi:10.1177/0959354394044004.
  • Thees, Michael, Sebastian Kapp, Kristin Altmeyer, Sarah Malone, Roland Brünken, and Jochen Kuhn. 2021. “Comparing Two Subjective Rating Scales Assessing Cognitive Load During Technology-Enhanced STEM Laboratory Courses.” Frontiers in Education 6: 1–16. doi:10.3389/feduc.2021.705551.
  • Tran, L. Q., Y. Sun, R. Guan, J. Saeed, L. Wang, and P. J. Radcliffe. 2019. “Development and Outcomes of Teaching PID Control in Classroom with Hands on Learning Experience.” Paper Presented at the 2019 IEEE International Conference on Industrial Technology (ICIT), 13–15 February.
  • Tune, Johnathan D., Michael Sturek, and David P. Basile. 2013. “Flipped Classroom Model Improves Graduate Student Performance in Cardiovascular, Respiratory, and Renal Physiology.” Advances in Physiology Education 37 (4): 316–320. doi:10.1152/advan.00091.2013.
  • Ulloa Salazar, Gemalli, and Claudio Díaz Larenas. 2018. “Using an Audiovisual Materials-Based Teaching Strategy to Improve EFL Young Learners’ Understanding of Instructions.” How 25 (2): 91–112. doi:10.19183/how.25.2.419.
  • van Gog, Tamara, and Fred Paas. 2008. “Instructional Efficiency: Revisiting the Original Construct in Educational Research.” Educational Psychologist 43 (1): 16–26. doi:10.1080/00461520701756248.
  • van Raalte, Lisa, and Rachel Boulay. 2013. “Designing and Evaluating a Scientific Training Program and Virtual Learning Materials.” The International Journal of Design Education 7 (2): 1–10. doi:10.18848/2325-128x/cgp/v07i02/38439.
  • Winberg, T. Mikael, and C. Anders R. Berg. 2007. “Students’ Cognitive Focus During a Chemistry Laboratory Exercise: Effects of a Computer-Simulated Prelab.” Journal of Research in Science Teaching 44 (8): 1108–1133. doi:10.1002/tea.20217.
  • Yerkes, Robert Mearns, and John D. Dodson. 1908. “The Relation of Strength of Stimulus to Rapidity of Habit-Formation.” Journal of Comparative Neurology and Psychology 18: 459–482. doi:10.1002/cne.920180503.

Appendices

Appendix 1: Thematic coding and visualisation of research data

Figure A1. A pie chart reflecting top recurring themes emerging from focus group data.


Figure A2. A visual map connecting key themes and their relationships.


Appendix 2: Example student survey questions (via Microsoft Forms)