
Laboratory learning objectives: ranking objectives across the cognitive, psychomotor and affective domains within engineering

Pages 454-473 | Received 07 Sep 2022, Accepted 08 Aug 2023, Published online: 22 Aug 2023

ABSTRACT

The literature on laboratory objectives in engineering education research is scattered and inconsistent. Systematic literature reviews identified the need for better understanding. This paper ranks the laboratory learning objectives across the cognitive, psychomotor and affective domains to improve scaffolding. It provides an opportunity for reflection, a pathway to confirm assessment alignment, and opens future research areas. To accomplish this, the Laboratory Learning Objectives Measurement (LLOM) instrument is used to survey 160 academics from around the world representing 18 engineering disciplines. The results suggest that the collective ranking order does represent a framework that can be used broadly. However, for greater alignment with consensus thinking, discipline rankings should be used. The cognitive domain was deemed the most important. These results provide the community’s opinion and may not necessarily be best practice, providing an opportunity for reflection.

1. Introduction

Laboratory learning is defined as any form of learning that takes place in any space (e.g. physical or virtual) where students can observe, practice and experiment (Ka Yuk Chan 2012). Systematic review after systematic review states the importance of laboratory learning to disciplines such as engineering (Nikolic, Ros, et al. 2021; Reeves and Crippen 2021) and science (Brinson 2015; Faulconer and Gruss 2018). However, such reviews raise the issue that some learning objectives or outcomes in the papers they synthesise are considered only implicitly (not assessed directly) and/or not holistically (not considering all learning being undertaken). Moreover, many research studies concentrate more on demonstrating the benefits of their innovation (e.g. a new tool, experiment or approach) than on a solid, holistic exploration of learning. This presents an opportunity for researchers to address this gap by working together to understand laboratory learning better.

Pioneering such an academic collaboration was an effort to define the engineering laboratory objectives via a colloquy in 2002 (Feisel and Rosa 2005). At this colloquy, thirteen laboratory objectives were defined, providing a platform of learning opportunities for academics to implement and to compare against when developing innovations or comparing laboratory modes. The guiding principle for that work is that if you don't know where to go, you won't know which road to take, and you won't know if you have arrived. In the following 20 years, researchers have built upon such initiatives and moved towards studies that try to measure laboratory learning in some way (Nikolic, Ros, et al. 2021). Discussion of the 13 laboratory objectives is commonplace in the engineering laboratory literature, with approximately 2000 citations, including papers such as Garcia-Loro et al. (2021), Lal et al. (2017) and Stefanovic et al. (2015). Developing a greater appreciation of laboratory assessment is very important because teaching, learning and assessment are inextricably linked, and assessment is the most significant motivator of learning (Hargreaves 1997).

While a greater discussion has occurred within laboratory education research, the explicit definition of objectives associated with particular research studies remains scattered, and assessment analysis remains concentrated within highly-ranked engineering education journals (Nikolic, Ros, et al. 2021). Prominent leaders have reiterated that more needs to be done to understand laboratory assessment (Loui 2016). As outlined above, laboratory objectives and laboratory assessment are inextricably linked. Therefore, to develop a deep understanding of laboratory assessments, one must first understand the objectives that need to be assessed.

An extension of the thirteen laboratory learning objectives is the Laboratory Learning Objectives Measurement (LLOM) instrument (Nikolic, Suesse, et al. 2021), used to gain a more holistic view of learning in the engineering laboratory. The instrument is designed so that the context of any engineering laboratory setting can be applied (by changing keywords in the template) for measurement purposes. Not all listed skills are expected to be achieved in any one laboratory setting. The LLOM instrument blends the thirteen laboratory learning objectives with the three learning domains associated with Bloom's Taxonomy (Anderson et al. 2001), creating a holistic template that measures learning in an engineering laboratory. Bloom's Taxonomy considers that learning occurs and can be scaffolded across three interconnecting domains – the cognitive, psychomotor and affective. Further insights into Bloom's Taxonomy are discussed in Section 2. It is important to note that while a separation exists, learning domains cannot be isolated from each other because almost all learning activities involve more than one domain (Bott 1996; Salim et al. 2013).

The blended approach used in LLOM is helpful because, in a traditional (hands-on) engineering laboratory, students must undertake activities like applying, analysing and evaluating information (cognitive); imitating, manipulating and articulating with their hands (psychomotor); and attending, responding and valuing with their presence (affective). Studies attempting to understand holistic learning in their laboratory innovation, such as Elawady and Tolba (2009) and Lindsay and Good (2005), try to apply or at least consider the multi-domain implications. However, as outlined in Nikolic, Ros et al. (2021), the objectives applied experimentally may not be consistent, leading to difficulties in comparing the outcomes from study to study. Details of the LLOM instrument are discussed further in the following section.

A further reason why a holistic instrument is needed is that research suggests that different laboratory access modes (traditional, virtual, remote etc.) have different learning strengths and weaknesses, such as those outlined in Elawady and Tolba (2009) and Lindsay and Good (2005). A better understanding of learning can allow the best pedagogy to be applied, e.g. the best laboratory mode (May et al. 2023). It has also been shown that non-traditional access modes have equal or better cognitive learning outcomes than traditional formats (Balakrishnan and Woods 2013; May 2021; Steger et al. 2020). However, when considering assessment-backed findings, almost all related research is concentrated only on cognitive objectives (Nikolic, Ros, et al. 2021). In contrast, student perception of learning indicates that it occurs across all three domains (Nikolic, Suesse, et al. 2021), suggesting the academic community is missing or underestimating the full extent of learning being attained. This shows a significant research gap across psychomotor and affective laboratory learning and is thus part of the warrant for this study. The first step in exploring such a limitation is understanding and reflecting on what learning is important. This importance is further compounded by the rapid transition to different learning modes caused by COVID-19 and the need for the academic community to understand the full impact of such changes (Behera et al. 2023). More recently, the capabilities of ChatGPT have been shown to impact assessment integrity related to written laboratory work, highlighting a further need to reflect on the objectives assessed (Nikolic et al. 2023). The same study also suggests that laboratory learning can become even more important in an artificially intelligent world.

Therefore, this study aims to advance engineering education research by answering the overarching research question: what are the most important engineering laboratory learning objectives across the cognitive, psychomotor and affective learning domains? By exploring this question, the academic engineering education community will gain insights into which objectives are deemed most important; it will be able to reflect on and determine whether the rankings are justified; and it will be able to reflect on whether the assigned assessments are assessing skills in important areas. This should provide a pathway for more targeted research on understanding laboratory learning. This paper scaffolds upon two previous studies. The first study explored whether laboratory rankings would differ across continents (Nikolic et al. 2022a). The analysis showed that across the cognitive and psychomotor domains, the rankings across locations were very uniform, with the minor differences occurring across the affective domain. The second study (Nikolic et al. 2022b) scaffolded upon the first by exploring rankings within one of the studied countries. The analysis suggested that the engineering discipline could influence ranking order. Therefore, this study will build upon this scaffold by answering the research question: is the ranking of importance different across disciplines? The answer to this question will provide a common framework for developing a better, holistic understanding of laboratory learning and a pathway to connect laboratory activity to more targeted assessment.

2. The LLOM instrument

While details of the LLOM instrument (Nikolic, Suesse, et al. 2021) can be found in other papers, it is important to cover some of the basics to understand the ranking order and the implications of the findings. The LLOM instrument aims to help academics holistically reflect on the learning being undertaken in their engineering laboratory classes, especially those seeking to publish in the engineering education literature. As was found in a systematic literature review (Nikolic, Ros, et al. 2021), engineering laboratory studies focus mainly on the perceived strengths of the reported innovation/implementation and miss the opportunity to explore and report on learning at a holistic level. This has resulted in knowledge being developed that provides assessment-backed insights on only a small subset of cognitive skills. Psychomotor and affective skills, if measured, are most likely measured via student perception only. The LLOM instrument offers a platform for a wide variety of research, including a series of scaffolded studies by the authors, that will enable a better understanding of the types of learning occurring in the laboratory across different experiments and modes of delivery.

As the introduction mentions, the LLOM instrument blends the 13 objectives listed in Feisel and Rosa (2005) with the revised version of Bloom's Taxonomy outlined in Anderson et al. (2001). The original version was published in 1956 and was revised in 2001 to support the systematic rationale of its construction. The premise behind the taxonomy is its aid in selecting, organising, or evaluating almost any set of instructional activities. Key to the decision to synthesise the 13 objectives with Bloom's Taxonomy is its philosophical leaning towards skills over content. The taxonomy has been heavily critiqued (Morshead 1965), resulting in variations in the names of skills, hierarchy, structure and new classifications (Simpson 1972). It would be expected that the LLOM instrument will also be critiqued and enhanced over time.

The 13 objectives are correlated to items across the cognitive, psychomotor and affective domains as defined by the taxonomy. The objectives were classified by Feisel and Rosa (2005) into three domains, which the LLOM instrument has worked with, expanded and further refined. The objectives regarding instrumentation, models, experiment, data analysis and design were all classified in the cognitive domain. The psychomotor (manipulation) and sensory awareness objectives were classified in the psychomotor domain. The learning from failure, creativity, safety, communication, teamwork and ethics objectives were classified as affective. They, too, recognised that almost all learning activities involve more than one domain (Bott 1996; Feisel and Rosa 2005; Salim et al. 2013). A visual representation of this blending and cross-over can be seen in Figure 1. Possible alignment of items in learning domains beyond Bloom's Taxonomy is out of scope and not considered.

Figure 1. LLOM – Blending Laboratory Objectives (Red) with Defining Skills from Bloom's Taxonomy (Black).


The learning domains within Bloom's Taxonomy are ranked from the highest order (represented by 1 below) to the lowest order. The higher-order skills are generally considered to be built upon mastering the lower-level skills, providing a scaffolded process to facilitate learning. It is important to note that alternatives to the hierarchy structure have been suggested (Atkinson 2013), resulting in a less formal mastery structure. The list below presents the three domains, the defining skills (highest to lowest) and related verbs (Anderson et al. 2001; Atkinson 2013). Ideally, we try to facilitate students to reach the highest order.

Defining Skills and Related Verbs in the Three Domains:

Cognitive:

  1. Creating – design, propose, modify, develop.

  2. Evaluating – assess, review, judge, appraise.

  3. Analysing – compare, test, measure, contrast, infer, plot.

  4. Applying – execute, solve, prepare, show.

  5. Understanding – explain, summarise, interpret, report.

  6. Remembering – identify, describe, list.

Psychomotor:

  1. Embody – design, self-manage, project-manage.

  2. Articulate – construct, combine, solve, develop.

  3. Perfect – demonstrate, complete, be precise, control.

  4. Manipulate – build, perform, execute, manipulate.

  5. Imitate – follow, repeat, copy, identify, match.

Affective:

  1. Characterising – break down situations and respond accordingly based on values, code of personal behaviour.

  2. Organising – can compare and contrast values & choices.

  3. Valuing – motivated to invest.

  4. Responding – willingly participating, ready to respond.

  5. Receiving/Attending – gives attention by choice, open to the experience.

The skills listed are meshed with the 13 laboratory objectives (Feisel and Rosa 2005) and have been written in a format allowing universal application across different engineering courses and disciplines. Course refers to the collective components of a subject, such as lectures, tutorials, laboratories, workshops etc. Discipline refers to the branch of engineering studies, such as electrical or civil engineering. In this format, the instrument can be used on older, current or new laboratory experimental implementations. It translates the objectives from the work of Feisel and Rosa (2005) into actionable items that can be measured within a particular laboratory context (the context of the course or experimental purpose). This allows for a more holistic understanding of the learning taking place and provides a structure that can be used to better understand the learning value of laboratory innovations/implementations or to compare across laboratory modes and/or disciplines.

Keywords within the text of an objective are written in italics and can be modified to match the required context or discipline. This context-based modification is what makes the instrument useful. Any related word can be used, not just the sample words given for context. For example, objective P1, written as 'Correctly conduct an experiment on [course equipment/software name – e.g. power systems]', could be modified to be 'Correctly conduct an experiment on control systems' or 'Correctly conduct an experiment on hydraulics'. As a further example, new innovative laboratory environments, such as a Makerspace or FabLab (Soomro, Casakin, and Georgiev 2022), can be adjusted to be used with the instrument. For example, some of the objectives could be modified to become P1 'Correctly conduct an experiment on 3D Printing', P2S 'Select appropriate commands and navigate interface to develop a 3D model for printing' and A6 'Learn from failure when 3D printed output is not as expected'. Successful use of the instrument requires the course coordinator or researcher to consider all items in the instrument, regardless of whether they are purposely planned within the designed experimentation.
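As an illustration only, the keyword substitution described above can be scripted in a few lines of R (the language used for the analysis later in this paper). The template text and contexts come from the examples in this section; the helper below is a hypothetical convenience, not part of the LLOM instrument.

    # Illustrative only: filling the italicised keyword slot of objective P1
    # for different course contexts (contexts taken from the examples above).
    llom_p1  <- "Correctly conduct an experiment on %s"
    contexts <- c("power systems", "control systems", "hydraulics", "3D Printing")
    sprintf(llom_p1, contexts)
    # e.g. "Correctly conduct an experiment on power systems", and so on for each context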

The objectives listed in Nikolic, Suesse, et al. (2021) and the relevant learning domain level listed above are outlined in Table 1. Grouping of the items was confirmed by factor analysis (Kaiser rule, parallel analysis, optimal coordinates and acceleration factor) within that study using student data. The groupings show that the cognitive domain has two distinct subgroups: items C1–C7 with an analytical focus, and C8–C9 with a writing focus.
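Of the retention criteria listed above, the Kaiser rule is the simplest to illustrate: factors are retained while the eigenvalues of the item correlation matrix exceed 1. The sketch below assumes a hypothetical respondent-by-item matrix of LLOM ratings named responses; parallel analysis, optimal coordinates and the acceleration factor require additional packages (e.g. nFactors) and are not shown.

    # Kaiser rule sketch (base R only): retain factors with eigenvalue > 1.
    # `responses` is a hypothetical respondents-by-items matrix of LLOM ratings.
    eig <- eigen(cor(responses))$values
    sum(eig > 1)   # number of factors suggested by the Kaiser rule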

TABLE 1. Laboratory learning objectives measurement items with associated taxonomy descriptor & level.

Note that the psychomotor domain has two objectives, P2 and P6, with a software (S) and hardware (H) version to cater to software or hardware focussed courses/disciplines. Most disciplines are expected to engage with at least some form of software, such as simulation or CAD.

In Table 1, the column Laboratory Objective Descriptor shows that some learning objective items from Feisel and Rosa (2005) overlap domains. This supports the overlapping structure shown in Figure 1 and discussed previously.

At first glance, some of the connections between objectives and domains may seem out of place, and this can usually be overcome by exploring the taxonomy verbs. Some common misconceptions come from items such as P5, related to fault finding. Many academics consider fault-finding a dominant cognitive process. However, studies such as Nikolic, Vial, et al. (2015) showcase how teaching assistants with the required cognitive knowledge fail miserably when they do not know how to physically use and manipulate the tools necessary to facilitate the fault-finding process. As shown in Figure 1, domain crossover occurs and is natural.

Most misconceptions occur across the affective domain, so we paid more attention to them. The following provides examples of how the objectives sit within the terminology of the affective domain. Teamwork (A1) can be considered as characterisation: by working in a group, learners must balance their own values with the team's values to complete the laboratory activities, prioritise tasks, and practice teamwork (U. O. Waterloo 2022). Communication (A2) can be considered as responding, in which students speak and actively respond to others; in this case, the communication is set on topics related to the laboratory. Independent work (A3) can be considered as receiving/attending, as it demonstrates that through engaging in laboratory activities, the student gives attention by choice and is open to the experience. Ethical issues (A4) can be considered as organising because it requires the student to compare value systems and understand the evidence behind values. Creativity (A5) can be considered as valuing because the effort required to be creative demonstrates the students' motivation to invest. Learning from failure (A6) can also be regarded as characterising because it reflects the code of personal behaviour, e.g. will the student give up, become disruptive, seek spoon-feeding from the demonstrator or appreciate the value in the experience? Motivation (A7) can also be considered as valuing; it aims to demonstrate the student's motivation to invest in the experience.

3. Research implementation

A multi-institution and multi-disciplinary research team was assembled to investigate the research question. In 2021, members of the team invited their university, research and professional contacts to complete an online Qualtrics survey that required participants to rank, in order of importance (1 = highest ranked), the multi-domain objectives listed in the Laboratory Learning Objectives Measurement (LLOM) instrument as outlined in Nikolic, Suesse, et al. (2021). This was completed with UOW ethics approval number 2021/252. The survey was conducted anonymously. Before commencing the survey, interested parties were given full details of the research study, with participants providing informed consent to proceed. Approximately 3000 academics from all continents were invited to participate, with 219 survey commencements and 160 completions. Response distribution was 113 from Australasia, 25 from Europe, 12 from Asia, 9 from North America and 1 from South America. While Australasian responses dominate, an earlier study (Nikolic et al. 2022a) found that, across the board, statistical differences in rankings were minimal across the cognitive and psychomotor domains but evident across the affective domain.

Discipline response distribution was 2 Aeronautical, 7 Biomedical, 17 Chemical, 14 Civil, 17 Computer, 22 Electrical, 19 Electronics, 2 Industrial/Process, 10 Materials, 21 Mechanical, 8 Mechatronics, 1 Mining, 4 Other, 10 Software and 6 Telecommunications. For analysis, disciplines with at least 10 responses were analysed separately.

In terms of laboratory teaching experience, 23% of respondents had less than five years of teaching experience, 20% had between 5- and 10-years of experience, and 57% had 10 or more years of experience. Only 3% of respondents were female. With a 97% male response rate, this imbalance reflects the ratios found in the engineering departments of many Western universities, especially in Australasia.

Participants were required to rank the objectives from most important (ranking = 1) to least important (ranking 6 or 7, depending on the domain). Once this was completed, participants were required to rank a list that comprised the highest and lowest objective from each domain.

A fixed initial ranking, based on the order listed in Table 1, was used to determine if any rankings remained unchanged. None of the rankings were left in the default state for the responses analysed.

This work is not without its limitations. This research is based on a self-selection process and may represent the ideas of academics more engaged with and influenced by engineering education research. While instructions were provided on how to interpret the LLOM template, there is no guarantee that all items and the application of the template were understood, or that participants could pick out the key terms from the context. While approximately 3000 academics were invited, only a small number completed the survey in full. Based on previous experience, such a small response rate is quite common. This is especially due to the high cognitive load on participants to consider the ranking order carefully, particularly at a time when academic staff across the world were enduring very high workload pressures related to COVID. While most responses were from Australasia, the earlier analysis in Nikolic et al. (2022a) suggests that the impact is negligible. The rankings represent the opinions of academics and do not represent the perceptions of students and industry, which may have different priorities. This will be explored in a follow-up study.

4. Results

The platform R (v4.0.5) was used for the statistical analysis, with the results shown in Table 2 (cognitive), Table 3 (psychomotor) and Table 4 (affective). The data was analysed collectively and across disciplines with at least 10 responses; disciplines with fewer than 10 responses were aggregated into the 'other' group. Rankings were determined using averages: the lower the average, the more academics ranked the objective as more important than objectives with a higher average. In brackets, the 95% confidence interval (CI) is shown. When two confidence intervals do not overlap, a statistically significant difference in mean values can be concluded. Such differences from the collective are highlighted in green. For example, for C3, the collective has a confidence interval (3.69, 4.43) and software engineering has a confidence interval (1.24, 2.96). As the intervals do not overlap (the upper endpoint for software engineering, 2.96, is below the lower endpoint for the collective, 3.69), a statistically significant difference in mean values can be concluded.
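The interval and overlap logic described above can be reproduced with a few lines of base R. The sketch below is not the authors' analysis script; it assumes a hypothetical data frame llom with a discipline column and one column per objective (e.g. C3) holding each respondent's rank.

    # t-based mean and 95% confidence interval for one objective's ranks
    ci_95 <- function(x) {
      m    <- mean(x)
      half <- qt(0.975, df = length(x) - 1) * sd(x) / sqrt(length(x))
      c(lower = m - half, mean = m, upper = m + half)
    }

    collective <- ci_95(llom$C3)                                  # e.g. (3.69, 4.43)
    software   <- ci_95(llom$C3[llom$discipline == "Software"])   # e.g. (1.24, 2.96)

    # Non-overlapping intervals are read as a statistically significant mean difference
    software["upper"] < collective["lower"] || collective["upper"] < software["lower"]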

TABLE 2. Learning objectives cognitive domain (averages with 95% confidence interval) and ranking order.

The value in the last column shows the p-value of the Kruskal–Wallis test, the non-parametric equivalent of ANOVA, used to account for non-Gaussian distributed data and better suited to the small sample size. The p-value tests for mean differences across groups; that is, it examines whether, for a particular objective (e.g. C1), the mean responses differ across the disciplines. If the p-value is <5%, responses differ across disciplines for that question; otherwise, they do not.
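A per-objective Kruskal–Wallis test of this kind can be run with base R's kruskal.test, as sketched below over the same hypothetical llom data frame; the objective list is truncated for brevity.

    # Kruskal-Wallis p-value per objective: do responses differ across disciplines?
    objectives <- c("C1", "C2", "C3")   # in practice, all items in the domain
    p_values <- sapply(objectives, function(obj) {
      kruskal.test(llom[[obj]] ~ llom$discipline)$p.value
    })
    round(p_values, 3)                  # values below 0.05 flag discipline differences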

A multivariate analysis of variance (MANOVA) was applied to determine if there is a statistical difference in means across disciplines for any particular objective. The MANOVA p-values for Tables 2–5 are 0.049, <0.001, 0.030 and 0.060, respectively.
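The classical MANOVA step can be expressed in base R roughly as follows. This is a sketch over the hypothetical llom data frame, not the authors' script; the bootstrapped variant discussed below relies on the MANOVA.RM package instead.

    # Classical MANOVA across the cognitive objectives: is there any discipline effect
    # on the vector of mean responses?
    fit <- manova(cbind(C1, C2, C3, C4, C5, C6, C7, C8, C9) ~ discipline, data = llom)
    summary(fit)   # Pillai's trace by default; small p-values indicate mean differences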

TABLE 3. Learning objectives psychomotor domain (averages with 95% confidence interval) and ranking order.

TABLE 4. Learning objectives affective domain (averages with 95% confidence interval) and ranking order.

TABLE 5. Learning objectives all domains highest and lowest importance (averages with 95% confidence interval) and ranking order.

Since there is no classic non-parametric alternative to the MANOVA test (Kruskal–Wallis is the non-parametric alternative to ANOVA), non-parametric bootstrapping was applied to the MANOVA statistic (using the R package MANOVA.RM; Friedrich, Konietschke, and Pauly 2019), obtaining valid p-values even if the data is not normally distributed. The obtained non-parametric MANOVA p-values are 0.013 (Table 2), <0.001 (Table 3), 0.077 (Table 4), and 0.084 (Table 5). This indicates that, overall, the mean responses differ across disciplines for some objectives in Tables 2 and 3.
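A resampling-based version along those lines might look like the sketch below. The call shown (MANOVA.wide with a wild-bootstrap option) is an assumption based on the MANOVA.RM package's documented interface and should be checked against the installed version; llom remains the hypothetical data frame used above.

    # Sketch only: bootstrapped MANOVA via MANOVA.RM (Friedrich, Konietschke, and Pauly 2019).
    # Function name and arguments are assumptions; verify against the package documentation.
    library(MANOVA.RM)
    fit_np <- MANOVA.wide(cbind(C1, C2, C3, C4, C5, C6, C7, C8, C9) ~ discipline,
                          data = llom, iter = 1000, resampling = "WildBS")
    summary(fit_np)   # resampling-based p-values that do not rely on normality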

Each table also provides a visual representation of the objectives in ranking order. Visual representations can help develop a better understanding of the data. Colour coding is used to show how the collective ranking differs across the disciplines. For example, in Table 2, C1 is light blue. The different ranking of C1 for each discipline can be easily observed in the table by following the colour trend.

Table 5 shows the averages from ranking the highest and lowest objectives from each domain. This aimed to explore the ranking of the three domains themselves.

Table 6 compares the laboratory objective ranking for the collective group and the taxonomy order ranking. The learning objectives in ranking order for the collective group are displayed based on the taxonomy level they represent. For example, C1 represents level 5 of Bloom's Taxonomy cognitive domain, as indicated in Table 1. As the higher-order skills require mastery of the lower-ranking items, it is assumed that the learning objectives linked to higher-order skills should receive a higher ranking. That is, some correlation between the taxonomy level and the importance of the learning objective is expected.
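One simple way to quantify that expected correlation is a Spearman rank correlation between each objective's collective importance rank and its taxonomy level. In the sketch below, the importance ranks follow the collective cognitive order reported in Section 5.1, while the taxonomy levels are illustrative placeholders only; the actual levels are those listed in Table 1.

    # Spearman correlation between collective importance rank and taxonomy level.
    # Importance ranks follow the collective cognitive order in Section 5.1;
    # the taxonomy levels below are placeholders for illustration, not the paper's data.
    importance_rank <- c(C1 = 1, C2 = 2, C7 = 3, C3 = 4, C5 = 5, C4 = 6, C6 = 7, C8 = 8, C9 = 9)
    taxonomy_level  <- c(C1 = 5, C2 = 1, C7 = 2, C3 = 3, C5 = 3, C4 = 4, C6 = 4, C8 = 6, C9 = 6)
    cor.test(importance_rank, taxonomy_level, method = "spearman")   # reports rho and a p-value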

TABLE 6. Comparison between laboratory objective ranking for the collective group and Bloom's Taxonomy (BT) order ranking.

The results of the research questions are outlined in the upcoming discussion section.

5. Discussion

Each domain is discussed separately below.

5.1. Cognitive domain

As seen in Table 2, collectively, the averages range from 3.11 to 7.29, in order from most important to least important: C1, C2, C7, C3, C5, C4, C6, C8, C9. There is little variation across the disciplines when looking at average values, with only a few objectives being statistically different, as discussed below.

Collectively, the most important cognitive ranked items were C1, C2 and C7, reflecting understanding, design/modelling and analysis, respectively. However, materials engineering and the electrical-related disciplines (computer, electrical, electronics, and software) place a higher value on C3 (using engineering tools to solve problems) than on analysis (C7). C3 ranked fourth in the collective. For electronics and software engineering, the average for C3 is statistically different compared to the collective. The four highly ranked objectives are not surprising, as they represent what is typically seen in the engineering education literature. This is because laboratory innovation and laboratory comparison research are centred on observations and assessments that try to measure these cognitive objectives (Nikolic, Ros, et al. 2021). Example studies include using pre/post-tests and/or exam data (Gamo 2019; Kollöffel and de Jong 2013; Shyr 2010).

For some time, it has been claimed that a fundamental purpose of laboratory work is for students to understand the relationship between theories and models, and objects and events, to develop holistic, conceptual knowledge (Bernhard 2010). Students develop understanding by learning something practising engineers are assumed to already know (Feisel and Rosa 2005). Therefore, it is not surprising to find that C1 and C2 are closely connected across most of the disciplines. Interestingly, through C7, some high-level quantification of C1 and C2 can occur via analysis. Learning to use engineering tools (C3) is also important because it is through the use of tools that we learn to work more efficiently (e.g. calculation speed), accurately (e.g. finer measurements) and make further advancements (e.g. AI opens new opportunities). The academic community's challenge is understanding how the tools constrain and enable what is learned (Bernhard 2018).

The least important cognitive ranked items for the collective were C6 (safety), C8 (summarising), and C9 (logbook/report writing). Objective C9 is last or second last for every discipline apart from materials engineering (statistically different). The low ranking contrasts with laboratory practices, where reports and logbook-related assessments dominate (Nikolic, Ros, et al. 2021). This dominance can be connected to C1, the highest-ranking objective, because report writing provides a platform for students to communicate and demonstrate their understanding (Masoud 2017). Academics and students believe that improvements in technical writing are needed, and lab reports can play a role in achieving that (St Clair, Kim, and Riley 2021; Wright, Slaboch, and Jamshidi 2022). The question then needs to be asked: why is the ranking so low? It could be that report writing is associated with a lower-order skill in Bloom's Taxonomy, opening an area for further investigation. As reported in Nikolic et al. (2023), the writing capabilities of artificial intelligence (AI) will influence changes. AI may be used to support students in producing laboratory reports of higher quality, or assessment integrity risks may see lab reports transition to other assessment types. Assessment integrity risks may cement laboratory reports as a low-ranking objective. Time will tell.

While C6 (safety) is not statistically different across most disciplines by average value (it is for software engineering), it is ranked as more important in civil and materials engineering. This could be because of explicit engagement; for example, materials engineering students may be more exposed to Materials Safety Data Sheets (MSDS) and civil engineers to more dangerous working environments. It could also have been expected to be higher in chemical engineering or civil-related industries such as mining, which are associated with higher fatalities (Gutiérrez Ortiz, Fitzpatrick, and Byrne 2021). Arguably, all types of engineers could potentially work in high-risk sites or be exposed to risk at low-risk sites (Trevelyan 2021). Many major engineering companies, such as BHP (BHP 2023) and Rio Tinto (RioTinto 2022), front-load every major report with a safety analysis, one of the most important metrics within the reporting period. With such considerations, would the learning-by-doing environment of the laboratory be better suited to giving safety greater attention? Adding weight to such a proposal is that the accident rate in academic laboratories is about 10–50 times higher than in industrial laboratories (Wahab et al. 2021). This is correlated with an absence of hazard identification or risk analysis (Wahab et al. 2021). This suggests that the academic community could be overlooking the importance of safety, and greater emphasis is needed. C6 is ranked last, and is statistically different, for software engineering. A possible explanation is that computer laboratories involve minimal physical equipment/instrumentation. This supports the notion that discipline-specific environments influence ranking order.

Analysing the cognitive data for the collective group from Table 6, the ranking order somewhat follows the taxonomy hierarchy, with higher-order learning generally ranked as more important. As less formal alternatives to the hierarchy structure have been suggested (Atkinson 2013), this slight deviation from an exact order can be expected. While 'understanding' (C1) is a lower-order skill set, the importance of understanding in terms of scaffolding provides a good reason for its high importance. Beyond that, creating and evaluating dominate the first half of the list, with the lower-order skills concentrating in the second half. This alignment with the taxonomy provides some support for the value of the ranking order. Moreover, this also suggests that the academic engineering education community is thinking about laboratory objectives in a learning-beneficial scaffolded order.

The first study (Nikolic et al. 2022a) found that across international borders, ranking order in the cognitive domain was mainly consistent. The second study (Nikolic et al. 2022b) suggested that at a local level, discipline differences could explain the variations. This study supports those assumptions by highlighting how some discipline-specific factors can influence rankings. While an electrical (computer, electrical, electronics, and software) vs. non-electrical discipline grouping of differences in ranking order can be seen, the variations are minimal, and therefore we can conclude that the collective group provides a cognitive ranking order that represents most of engineering and is somewhat correctly scaffolded.

5.2. Psychomotor domain

As seen in Table 3, collectively, the averages range from 2.46 to 6.86, in order from most important to least important: P1, P3, P2H, P6H, P4, P2S, P6S, P7, and P5. Compared to the cognitive domain, there is slightly more variation across the disciplines when looking at average values. This is not surprising, as different disciplines operate in different ways. For example, the scope of psychomotor activity for software engineers is limited, and this is reflected in the most divergent ranking order. As per the cognitive domain, similarities across disciplines were closest in an electrical vs. non-electrical related grouping.

Collectively, the psychomotor objectives ranked highest were P1, P3 and P2H, reflecting successful experimentation, planning & execution, and instrument use, respectively. This order mostly held intact across all disciplines (apart from software, due to a possible lower requirement to engage with instrumentation). Electrical-related disciplines attached a higher level of importance to P4 (constructing/coding) than the other disciplines. We hypothesise that this is due to differences in experimentation practices.

Unsurprisingly, successful experimentation was ranked highest for most disciplines because achievement generally correlates with students obtaining the correct answer/output (Wolf 2010). Gaining the correct answer/output demonstrates that the student could embody the experimental skills required for success. This is also connected to planning and execution, as planning is linked to quality outcomes (Jiménez et al. 2015). The two objectives have a strong interconnection, hence P1 and P3 dominated the top rankings. Apart from software engineering, the process of selecting, engaging and using a variety of instruments is fundamental to carrying out an experiment (Feisel and Rosa 2005), hence P2H may also be highly ranked.

The lowest-ranked items were P6S (which correlates with the less software-based disciplines), P7 (taking readings), and P5 (multi-sensory awareness to diagnose faults). The low ranking of P5 is interesting because when 'fault finding' is overlooked, it can be a major factor in reducing student satisfaction/experience and stifling student progress, and is, therefore, a very important skill to scaffold (Nikolic 2015; Nikolic, Ritz, et al. 2015). While fault-finding is also a cognitive skill, evidence collected in Nikolic, Vial, et al. (2015) highlights that cognitive ability is of little value if the user does not know how to physically operate and manipulate the tools necessary for fault-finding in the laboratory. The study showed how Ph.D. students with the proper cognitive knowledge could be easily stumped in solving basic faults if they did not have the right psychomotor skills. Additionally, based on the first author's industry experience, fault-finding is one of the key tasks undertaken in practice. Beyond being able to operate the necessary fault-finding tools, the ability to move around and follow a scent or sound that seems not quite right; to use senses to observe a change of temperature; to understand how to move, shake or reposition things; or to see warning errors and know how to respond by pushing, turning or pulling the right buttons/switches/levers is essential. Based on Nikolic (2015), Nikolic, Ritz, et al. (2015) and Nikolic, Vial, et al. (2015), it may be suitable to seek community reflection on whether P5 deserves higher importance. It will be of great interest to see if those working in industry agree. Therefore, a supplementary industry-based survey will be established as future work. P7 (taking readings) is associated with the lower-order skill of imitation in the psychomotor domain. Higher-order skills build upon and synthesise such a skill. Hence it is not surprising that it has a lower rank.

Analysing the psychomotor data for the collective group from Table 6, apart from P5, the ranking order generally follows the taxonomy order. This suggests that the academic engineering education community thinks about laboratory objectives in a learning-beneficial scaffolded order. Falling under the 'articulate' skill, which has the second-highest importance in the taxonomy hierarchy, P5 may not be best suited to being last on the list. This supports the discussion above and suggests that academics may unconsciously downplay the importance of the psychomotor component of fault-finding. Only the mechanical discipline recognises P5 as a mid-ranked objective.

The first study (Nikolic et al. 2022a) found that across international borders, ranking order in the psychomotor domain was in complete alignment. The second study (Nikolic et al. 2022b) suggested that at a local level, discipline differences could explain the variations. This study supports those assumptions by highlighting how some discipline-specific factors can influence rankings. Some items, such as P6H/P6S and P4, had some clear variations based on discipline. This is not unexpected, as the different disciplines engage with different equipment and activities. While software engineering may be the exception, the discipline variations are mostly minimal. The objectives somewhat follow Bloom's Taxonomy scaffolding. Therefore, we can conclude that the collective group provides a psychomotor ranking order that represents most of engineering and is somewhat correctly scaffolded.

5.3. Affective domain

As seen in Table 4, collectively, the averages range from 2.49 to 5.50, in order from most important to least important: A1, A2, A3, A6, A5, A7, A4. There is little variation across the disciplines when looking at average values, apart from A7.

Collectively, the most important affective ranked items were A1, A2 and A3, reflecting teamwork, communication and independence. Every discipline apart from computer engineering rated teamwork (A1) highest (this difference was statistically significant), with communication (A2) ranked second or third. This is not surprising, as the two items are interconnected (good communication is required for successful teamwork), and it corresponds to the growing calls for students to be able to collaborate with others, a primary function of the practising engineer (Almeida, Becker, and Villanueva 2021; Avila, Van Petegem, and Libotton 2021; Trevelyan 2014). In industry, working with others is a dominant activity (Shyr 2010). Students also recognise that teamwork and communication are priority generic competencies for their professional careers (Girotto and Oliveira 2022). Successfully implementing teamwork in a laboratory is not without its challenges. Students must be guided to overcome team conflicts, poor communication skills, free riders and personal differences (Vasquez et al. 2020).

The third highest ranked item, independence (A3), is somewhat the opposite of objectives A1 and A2. Independence is also important because students must develop the capability of directing their own learning (Wagener 2006). Therefore, a mix of individual and team-based experimentation can be beneficial.

The least important items concentrated mainly around A4 (ethics) and A7 (motivation). The low ethics ranking corresponds with growing calls to increase or reposition ethics teaching in engineering (Gwynne-Evans, Chetty, and Junaid 2021; Stransky et al. 2021; Valentine et al. 2020). This includes identifying and engaging with different elements of ethics across the curriculum within an engineering programme (Gwynne-Evans, Chetty, and Junaid 2021). For example, this could include greater awareness of ethics related to safety (Stransky et al. 2021), which would be well-suited to laboratory environments. This low ranking of ethics corresponds with the low ranking given to C6 (safety) in the cognitive domain. The first author's experience when discussing laboratory ethics with colleagues is that many overlook how ethics can be applied, such as correct data recording even when results are not as expected. In fact, from the author's experience, it generally takes an in-depth discussion with coordinators to raise awareness of the ethical factors overlooked. Moreover, some assessment rubrics can punish ethical measurement recordings and encourage unethical practices. This includes writing the correct answer when the readings measured by instrumentation are different, in order to gain the required marks. If one can do that in the laboratory, might one do the same to pass a quality test on the factory floor, e.g. to obtain a performance bonus?

The low ranking for motivation is at odds with how motivation is prioritised in much laboratory engineering education research. One of the primary drivers and research outcomes stated in papers is determining whether educational innovation improves motivation, such as in Ekin et al. (2021), Nedic, Nafalski, and Machotka (2010) and Vojinovic et al. (2020). Motivation is important in helping one achieve and succeed due to the relationship between metacognition and the ability to self-regulate learning (Maslow 1943; Zimmerman and Moylan 2009).

Learning from failure (A6) was only highly ranked for computer engineering (statistically different from the collective). This reiterates the similar message discussed in the psychomotor domain related to fault-finding. It can also be related to motivation. Entrepreneurs, in particular, talk about the importance of failing repeatedly and learning from each failure. If learning from failure is not encouraged, it can impact student satisfaction and experience (Nikolic, Ritz, et al. 2015). This can be linked to resilience, an important skill to develop (Trevelyan 2014), especially in a COVID world of delivery challenges. It would be of interest to conduct a future study to explore why computer engineers rated this objective differently.

Analysing the affective data for the collective group from Table 6, the ranking order contrasts with the other two domains, with the order of importance somewhat reversed relative to the expected taxonomy order. The first study (Nikolic et al. 2022a) found that across international borders, ranking order in the affective domain was not in alignment. The second study (Nikolic et al. 2022b) suggested that it is mostly aligned at a local level. This study suggests that the rankings are generally in alignment across disciplines, but some noticeable disciplinary differences are noted. Together, the three studies suggest that a blend of local and discipline-based viewpoints might influence ranking in the affective domain the most. This highlights an exciting area for further research. Given the local alignment, is it possible that accreditation or cultural values are a major influence? Such a theory is consistent with the different perspectives on the affective domain that the authors have encountered among academics and reviewers from other parts of the world.

Given the scaffolding mismatch with Bloom's Taxonomy, is it that engineers are so focused on the technical that they do not give enough thought to the items in the affective domain? Or is it that they consider affective items more suitable for learning opportunities outside the laboratory? The data suggests that the scaffolding is well understood for the cognitive and psychomotor domains. Furthermore, affective skills can help develop emotional intelligence, a skill increasingly in demand as it is recognised by many as fundamental for leadership and successful careers (Cerri 2016; Stein 2017; Parrish 2015). All these questions highlight that this is an area for further exploration.

5.4. All domains

Table 5 provides an elementary analysis that determines the priority of domains within the objectives. Participants needed to rank the highest and lowest objectives from their individual domain rankings. While this approach has its limitations, it was a simple exercise to avoid survey fatigue, as participants had already completed three rankings. As expected, the three domains are intertwined; that is, there was no case where both the highest and lowest objectives from one domain were ranked as more important than every objective from another domain. If all domains are important, why is laboratory assessment concentrated on the cognitive domain (Nikolic, Ros, et al. 2021)?

For all disciplines apart from materials, computer, the aggregate 'other', and software engineering, the ranking of domains in importance was cognitive, psychomotor and affective (for both the highest and least important objectives). The dominance of the cognitive domain is not surprising, given its overwhelming focus in the laboratory-based engineering education literature (Nikolic, Ros, et al. 2021). Materials, computer and other had a higher affective ranking than psychomotor for the higher-ranked objectives. Software engineering completely bucked the trend and produced a very different order. The reasoning for these differences warrants further investigation in future studies.

On a collective basis, findings from this study conclude that academics believe 'understanding the operation of equipment/software used within the laboratory' is the most important laboratory objective. The scaffolding value of this understanding presents an appreciation of its worth. On the other end of the spectrum, the least important learning objective is 'consider ethical issues in laboratory experimentation and communication of discoveries'. With ethics growing in importance (Gwynne-Evans, Chetty, and Junaid 2021), future research may investigate whether such academic opinion needs changing. This is especially the case when considering ethical practices related to safety in the laboratory or industrial workplaces.

6. Future work

The analysis from this study has suggested that further work is required to determine if industry opinion aligns with the academic views expressed in this survey. An investigation into student alignment would also be of interest. The analysis also suggests that a future study should explore the misalignment with the affective taxonomy. A reflection activity on some of the lower ranked items, such as safety, ethics and fault-finding, would be appropriate. It would also be interesting to understand why motivation is ranked low when it is one of the important metrics in many laboratory engineering education studies. Finally, as outlined in the introduction, the rankings of the objectives will be triangulated against assessment practices to determine if and how the most important laboratory objectives are indeed being assessed. Of particular interest will be to explore why laboratory reports are a dominant assessment method when the findings suggest that being able to produce laboratory reports is not an important learning objective.

7. Conclusion

In terms of answering the overarching research question, this study has explored academic opinions to determine the most important laboratory learning objectives across the cognitive, psychomotor and affective domains, using the Laboratory Learning Objectives Measurement instrument (Nikolic, Suesse, et al. 2021). The data from this study, together with the earlier analyses in Nikolic et al. (2022a) and Nikolic et al. (2022b), suggests that the collective ranking order does represent a framework that can be used broadly to represent academic opinion. The most important cognitive ranked items were C1 (understanding), C2 (design/modelling) and C7 (analysis). The psychomotor objectives ranked highest were P1 (successful experimentation), P3 (planning & execution) and P2H (instrument use). The most important affective ranked items were A1 (teamwork), A2 (communication) and A3 (independence).

In answer to this study's research question (is the ranking of importance different across disciplines?), the data suggest that discipline rankings should be used for greater accuracy. Location-based rankings can also be considered for the affective domain.

This study provides the following benefits to the academic engineering education community:

  • Academics can compare the laboratory learning outcomes for their courses, factoring in the various learning objectives with consideration of importance.

  • Researchers in laboratory-based studies can compare if the objectives they are trying to improve correlate to the community’s interests.

  • The academic community can reflect on the order of importance and determine if it is optimal. For example, the results from the affective domain suggest that some of the less important items need review. Just because the community thinks this way does not mean it is right or that it should not change. This work is a starting point for such conversations.

  • Researchers can scaffold this information to help build a holistic understanding of learning occurring in the laboratory.

  • Academics can explore if the assessments confirm competency across the important learning objectives.

  • Academics or researchers can duplicate this study locally to check for alignment, scaffolding the evidence for or against those found in this study.

  • Finally, while the ranking order has shown substantial alignment, academics can also reflect on whether this outcome is desirable.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Sasha Nikolic

Sasha Nikolic received a B.E. degree in telecommunications and a PhD in engineering education from the University of Wollongong, Australia, in 2001 and 2017, respectively. He is a Senior Lecturer of Engineering Education at the University of Wollongong. His interest is developing career-ready graduates involving research in teaching laboratories, artificial intelligence, industry engagement, work-integrated learning, knowledge management, communication, and reflection. Dr Nikolic has been recognised with many awards, including an Australian Award for University Teaching Citation in 2012 and 2019. He is a member of the executive committee of AAEE and an Associate Editor for AJEE and EJEE.

Thomas F. Suesse

Dr. Thomas F. Suesse completed his MSc (Dipl-Math) degree in mathematics at the Friedrich-Schiller-University (FSU) of Jena, Germany, in 2003. Dr Suesse then worked as a research fellow at the Institute of Medical Statistics, Informatics and Documentation (IMSID) and FSU. In 2005 he went to Victoria University of Wellington (VUW), New Zealand, to start his PhD in statistics, and his degree was conferred in 2008 with his thesis titled 'Analysis and Diagnostics of Categorical Variables with Multiple Outcomes'. In 2009 Dr Suesse started working as a research fellow at the Centre for Statistical and Survey Methodology (CSSM) at the University of Wollongong. He was appointed as a lecturer at UOW in 2011 and promoted to senior lecturer in 2015. Currently he is at FSU on a research fellowship.

Sarah Grundy

Sarah Grundy is an education-focused lecturer at the School of Chemical Engineering, The University of New South Wales. Sarah predominantly teaches design subjects at all levels (undergraduate to postgraduate). Sarah has over 15 years of experience in Research & Development, Manufacturing, and project management in industry. Sarah's passion is to develop students to be credible engineers and make their impact in whatever industry through authentic learning practices.

Rezwanul Haque

Dr. Rezwanul Haque is a Senior Lecturer specialising in Manufacturing Technology at the University of the Sunshine Coast. As an inaugural member of the AAEE Academy, he has contributed significantly to the academic community. In 2019, Dr. Haque served as an Academic Lead at the School of Science and Technology, overseeing the launch of two new Engineering programs and reviewing existing ones. His dedication to learning and teaching earned him the prestigious Senior Fellow status at the Higher Education Academy (UK) in the same year. His research focuses on Engineering Education and material characterisation through neutron diffraction.

Sarah Lyden

Sarah Lyden completed her BSc-BE (Hons) at the University of Tasmania in 2011. From 2012 to 2015 she was a PhD candidate with the School of Engineering and ICT at the University of Tasmania. From March 2015 to February 2018 Sarah was employed as the API Lecturer in the field of power systems and renewable energy. Since 2018, Sarah has been employed as Lecturer in the School of Engineering. Sarah has been a member of the School of Engineering and ICT's STEM education and outreach team.

Ghulam M. Hassan

Dr. Ghulam M. Hassan is a Senior Lecturer in the Department of Computer Science and Software Engineering at The University of Western Australia (UWA). He received his PhD from UWA. He completed his MS and BS at Oklahoma State University, USA, and the University of Engineering and Technology (UET) Peshawar, Pakistan, respectively. His research interests are multidisciplinary problems, including engineering education, artificial intelligence, machine learning and optimisation in different fields of engineering and education. He is the recipient of multiple teaching excellence awards and was awarded the AAEE Engineering Education Research Design Award 2021.

Scott Daniel

Scott Daniel is a Senior Lecturer in Humanitarian Engineering at the University of Technology Sydney, and serves as Deputy Editor at the Australasian Journal of Engineering Education and on the Editorial Boards of the European Journal of Engineering Education, the African Journal of Teacher Education and Development, and the Journal of Humanitarian Engineering. Scott uses qualitative methodologies to explore different facets of engineering education, particularly humanitarian engineering. He won the 2019 Australasian Association for Engineering Education Award for Research Design for his work with Andrea Mazzurco on the assessment of socio-technical thinking and co-design expertise in humanitarian engineering.

Marina Belkina

Dr. Marina Belkina is a Lecturer and First Year Experience Coordinator at Western Sydney University. She has taught various subjects and courses (Foundation, Diploma, first and second years of Bachelor's degrees, and an online Associate Degree). She has implemented numerous projects to support learning, including creating the YouTube channel Engineering by Steps, leading the development of HD videos for first-year engineering courses, developing an iBook for physics, creating 3D lectures and animations for Engineering Materials, and conducting research exploring students' barriers to Higher Education.

Sulakshana Lal

Sulakshana Lal has a PhD in Engineering Education from Curtin University, Perth, WA, Australia. Her research focused on comparing the learning and teaching processes of face-to-face and remotely-operated engineering laboratories. With a keen interest in the intersection of technology and education, Sulakshana has published several articles in reputable journals and presented her work at national and international engineering education conferences. Her expertise lies in understanding the nuances of different laboratory pedagogical settings and harnessing technology to enhance laboratory learning outcomes. Sulakshana is passionate about sharing her knowledge and helping educators and students navigate the evolving landscape of engineering education.

References

  • Almeida, L. M. S., K. H. Becker, and I. Villanueva. 2021. “Engineering Communication in Industry and Cross-Generational Challenges: An Exploratory Study.” European Journal of Engineering Education 46 (3): 389–401. https://doi.org/10.1080/03043797.2020.1737646
  • Anderson, L. W., et al. 2001. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives, Abridged Edition. White Plains, NY: Longman.
  • Atkinson, S. P. 2013. Taxonomy Circles: Visualizing the Possibilities of Intended Learning Outcomes. London: BPP University.
  • Avila, D. T., W. Van Petegem, and A. Libotton. 2021. “ASEST Framework: A Proposal for Improving Teamwork by Making Cohesive Software Engineering Student Teams.” European Journal of Engineering Education 46 (5): 750–764. https://doi.org/10.1080/03043797.2020.1863339
  • Balakrishnan, B., and P. C. Woods. 2013. “A Comparative Study on Real lab and Simulation lab in Communication Engineering from Students’ Perspectives.” European Journal of Engineering Education 38 (2): 159–171. https://doi.org/10.1080/03043797.2012.755499
  • Behera, A. K., R. A. de Sousa, V. Oleksik, J. Dong, and D. Fritzen. 2023. “Student Perceptions of Remote Learning Transitions in Engineering Disciplines During the COVID-19 Pandemic: A Cross-National Study.” European Journal of Engineering Education 48 (1): 110–142.
  • Bernhard, J. 2010. “Insightful Learning in the Laboratory: Some Experiences from 10 Years of Designing and Using Conceptual Labs.” European Journal of Engineering Education 35 (3): 271–287. https://doi.org/10.1080/03043791003739759
  • Bernhard, J. 2018. “What Matters for Students’ Learning in the Laboratory? Do Not Neglect the Role of Experimental Equipment!.” Instructional Science 46 (6): 819–846. https://doi.org/10.1007/s11251-018-9469-x
  • BHP. 2023. BHP Operational Review for the Half Year Ended 31 December 2022. https://www.bhp.com/-/media/documents/media/reports-and-presentations/2023/230119_bhpoperationalreviewforthehalfyear31december2022.pdf.
  • Bott, P. A. 1996. Testing and Assessment in Occupational and Technical Education. New York: ERIC.
  • Brinson, J. R. 2015. “Learning Outcome Achievement in non-Traditional (Virtual and Remote) Versus Traditional (Hands-on) Laboratories: A Review of the Empirical Research.” Computers & Education 87: 218–237. https://doi.org/10.1016/j.compedu.2015.07.003
  • Cerri, S. T. 2016. The Fully Integrated Engineer: Combining Technical Ability and Leadership Prowess, 1st ed. (IEEE PCS Professional Engineering Communication Series). New York: Wiley.
  • Ekin, S., J. F. O’Hara, E. Turgut, N. Colston, and J. L. Young. 2021. “A Do-It-Yourself (DIY) Light Wave Sensing and Communication Project: Low-Cost, Portable, Effective, and Fun.” IEEE Transactions on Education 64 (3): 205–212. https://doi.org/10.1109/TE.2020.3029543
  • Elawady, Y., and A. S. Tolba. 2009. “Educational Objectives of Different Laboratory Types: A Comparative Study.” International Journal of Computer Science and Information Security 6 (2): 89–96.
  • Faulconer, E. K., and A. B. Gruss. 2018. “A Review to Weigh the Pros and Cons of Online, Remote, and Distance Science Laboratory Experiences.” The International Review of Research in Open and Distributed Learning 19 (2). https://doi.org/10.19173/irrodl.v19i2.3386
  • Feisel, L. D., and A. J. Rosa. 2005. “The Role of the Laboratory in Undergraduate Engineering Education.” Journal of Engineering Education 94 (1): 121–130. https://doi.org/10.1002/j.2168-9830.2005.tb00833.x
  • Friedrich, S., F. Konietschke, and M. Pauly. 2019. “Resampling-Based Analysis of Multivariate Data and Repeated Measures Designs with the R Package MANOVA.RM.” The R Journal 11 (2): 380. https://doi.org/10.32614/RJ-2019-051
  • Gamo, J. 2019. “Assessing a Virtual Laboratory in Optics as a Complement to On-Site Teaching.” IEEE Transactions on Education 62 (2): 119–126. https://doi.org/10.1109/TE.2018.2871617
  • Garcia-Loro, F., et al. 2021. “Laboratories 4.0: Laboratories for Emerging Demands Under Industry 4.0 Paradigm.” 2021 IEEE Global Engineering Education Conference (EDUCON), 903–909.
  • Girotto, M., and A. Oliveira. 2022. “Undergraduate Students’ Perceptions of the Development of Generic Competences and Their Relevance to the Engineering Profession.” European Journal of Engineering Education 47 (6): 1061–1082. https://doi.org/10.1080/03043797.2022.2113763
  • Gutiérrez Ortiz, F. J., J. J. Fitzpatrick, and E. P. Byrne. 2021. “Development of Contemporary Engineering Graduate Attributes Through Open-Ended Problems and Activities.” European Journal of Engineering Education 46 (3): 441–456. https://doi.org/10.1080/03043797.2020.1803216
  • Gwynne-Evans, A. J., M. Chetty, and S. Junaid. 2021. “Repositioning Ethics at the Heart of Engineering Graduate Attributes.” Australasian Journal of Engineering Education 26 (1): 7–24. https://doi.org/10.1080/22054952.2021.1913882
  • Hargreaves, D. J. 1997. “Student Learning and Assessment Are Inextricably Linked.” European Journal of Engineering Education 22 (4): 401–409. https://doi.org/10.1080/03043799708923471
  • Jiménez, M., L. Romero, M. Domínguez, and M. del Mar Espinosa. 2015. “5S Methodology Implementation in the Laboratories of an Industrial Engineering University School.” Safety Science 78: 163–172. https://doi.org/10.1016/j.ssci.2015.04.022
  • Ka Yuk Chan, C. 2012. “Laboratory Learning.” In Encyclopedia of the Sciences of Learning, edited by N. M. Seel, 1705–1708. Boston, MA: Springer US.
  • Kollöffel, B., and T. de Jong. 2013. “Conceptual Understanding of Electrical Circuits in Secondary Vocational Engineering Education: Combining Traditional Instruction with Inquiry Learning in a Virtual Lab.” Journal of Engineering Education 102 (3): 375–393. https://doi.org/10.1002/jee.20022
  • Lal, S., A. D. Lucey, E. D. Lindsay, P. R. Sarukkalige, M. Mocerino, D. F. Treagust, and M. G. Zadnik. 2017. “An Alternative Approach to Student Assessment for Engineering–Laboratory Learning.” Australasian Journal of Engineering Education 22 (2): 81–94. https://doi.org/10.1080/22054952.2018.1435202
  • Lindsay, E. D., and M. C. Good. 2005. “Effects of Laboratory Access Modes upon Learning Outcomes.” IEEE Transactions on Education 48 (4): 619–631. https://doi.org/10.1109/TE.2005.852591
  • Loui, M. C. 2016. “Board Changes and Neglected Research Topics.” Journal of Engineering Education 105 (1): 3–5. https://doi.org/10.1002/jee.20108
  • Maslow, A. 1943. “A Theory of Human Motivation.” Psychological Review 50 (4): 370–396. https://doi.org/10.1037/h0054346
  • Masoud, M. I. 2017. “Writing a Laboratory Report for Senior Electrical Engineering Courses: Guidelines and Recommendations.” 2017 IEEE Global Engineering Education Conference (EDUCON), 340–346. IEEE.
  • May, D., et al. 2021. “Switching from Hands-on Labs to Exclusively Online Experimentation in Electrical and Computer Engineering Courses.” 2021 ASEE Virtual Annual Conference.
  • May, D., G. R. Alves, A. A. Kist, and S. M. Zvacek. 2023. Online Laboratories in Engineering Education Research and Practice.
  • Morshead, R. W. 1965. “Taxonomy of Educational Objectives Handbook II: Affective Domain.”
  • Nedic, Z., A. Nafalski, and J. Machotka. 2010. “Motivational Project-Based Laboratory for a Common First Year Electrical Engineering Course.” European Journal of Engineering Education 35 (4): 379–392. https://doi.org/10.1080/03043797.2010.490579
  • Nikolic, S. 2015. “Understanding How Students Use and Appreciate Online Resources in the Teaching Laboratory.” International Journal of Online and Biomedical Engineering (iJOE) 11 (4): 8–13. https://doi.org/10.3991/ijoe.v11i4.4562
  • Nikolic, S., et al. 2022a. “An Australian University Comparison of Engineering Laboratory Learning Objectives Rankings.” Presented at the 33rd Australasian Association for Engineering Education Conference, Sydney, Australia.
  • Nikolic, S., et al. 2022b. “A European vs Australasian Comparison of Engineering Laboratory Learning Objectives Rankings.” SEFI 50th annual conference, Barcelona, Spain. European Society for Engineering Education (SEFI).
  • Nikolic, S., S. Daniel, R. Haque, M. Belkina, G. M. Hassan, S. Grundy, S. Lyden, P. Neal, and C. Sandison. 2023. “ChatGPT Versus Engineering Education Assessment: A Multidisciplinary and Multi-Institutional Benchmarking and Analysis of This Generative Artificial Intelligence Tool to Investigate Assessment Integrity.” European Journal of Engineering Education 48: 559–614.
  • Nikolic, S., C. Ritz, P. J. Vial, M. Ros, and D. Stirling. 2015. “Decoding Student Satisfaction: How to Manage and Improve the Laboratory Experience.” IEEE Transactions on Education 58 (3): 151–158. https://doi.org/10.1109/TE.2014.2346474
  • Nikolic, S., M. Ros, K. Jovanovic, and Z. Stanisavljevic. 2021. “Remote, Simulation or Traditional Engineering Teaching Laboratory: A Systematic Literature Review of Assessment Implementations to Measure Student Achievement or Learning.” European Journal of Engineering Education 46 (6): 1141–1162. https://doi.org/10.1080/03043797.2021.1990864
  • Nikolic, S., T. Suesse, K. Jovanovic, and Z. Stanisavljevic. 2021. “Laboratory Learning Objectives Measurement: Relationships Between Student Evaluation Scores and Perceived Learning.” IEEE Transactions on Education 64 (2): 163–171. https://doi.org/10.1109/TE.2020.3022666
  • Nikolic, S., P. J. Vial, M. Ros, D. Stirling, and C. Ritz. 2015. “Improving the Laboratory Learning Experience: A Process to Train and Manage Teaching Assistants.” IEEE Transactions on Education 58 (2): 130–139. https://doi.org/10.1109/TE.2014.2335712
  • Parrish, D. R. 2015. “The Relevance of Emotional Intelligence for Leadership in a Higher Education Context.” Studies in Higher Education 40 (5): 821–837. https://doi.org/10.1080/03075079.2013.842225
  • Reeves, S. M., and K. J. Crippen. 2021. “Virtual Laboratories in Undergraduate Science and Engineering Courses: A Systematic Review, 2009–2019.” Journal of Science Education and Technology 30: 16–30.
  • Rio Tinto. 2022. Annual Report 2021. https://www.riotinto.com/-/media/content/documents/Invest/Reports/Annual-reports/Annual-report-2021/RT-Annual-report-2021.pdf?rev=0cc3e78061c341aca710df4c1a5e2ed3.
  • Salim, K. R., R. Ali, N. H. Hussain, and H. N. Haron. 2013. “An Instrument for Measuring the Learning Outcomes of Laboratory Work.” Presented at the International Engineering and Technology Education Conference, Ho Chi Minh City, Vietnam.
  • Shyr, W.-J. 2010. “Multiprog Virtual Laboratory Applied to PLC Programming Learning.” European Journal of Engineering Education 35 (5): 573–583. https://doi.org/10.1080/03043797.2010.497550
  • Simpson, E. J. 1972. “The Classification of Educational Objectives, Psychomotor Domain.”
  • Soomro, S. A., H. Casakin, and G. V. Georgiev. 2022. “A Systematic Review on FabLab Environments and Creativity: Implications for Design.” Buildings 12 (6): 804. https://doi.org/10.3390/buildings12060804
  • St Clair, S., D. Kim, and C. Riley. 2021. “Undergraduates’ Perspectives on Readiness, Writing Transfer, and Effectiveness of Writing Instructions in Engineering Lab Report Writing.” ASEE Annual Conference Proceedings.
  • Stefanovic, M., D. Tadic, S. Nestic, and A. Djordjevic. 2015. “An Assessment of Distance Learning Laboratory Objectives for Control Engineering Education.” Computer Applications in Engineering Education 23 (2): 191–202. https://doi.org/10.1002/cae.21589
  • Steger, F., A. Nitsche, A. Arbesmeier, K. D. Brade, H. Schweiger, and I. Belski. 2020. “Teaching Battery Basics in Laboratories: Hands-On Versus Simulated Experiments.” IEEE Transactions on Education 63 (3): 198–208. https://doi.org/10.1109/TE.2020.2970554
  • Stein, S. 2017. The EQ Leader: Instilling Passion, Creating Shared Goals, and Building Meaningful Organizations Through Emotional Intelligence. 1st ed. Hoboken: Wiley.
  • Stransky, J., C. A. Bodnar, M. Cooper, D. Anastasio, and D. Burkey. 2021. “Authentic Process Safety Decisions in an Engineering Ethics Context: Expression of Student Moral Development Within Surveys and Immersive Environments.” Australasian Journal of Engineering Education 26 (1): 117–126. https://doi.org/10.1080/22054952.2020.1809881
  • Trevelyan, J. 2014. The Making of an Expert Engineer. London: CRC Press/Balkema.
  • Trevelyan, J. P. 2021. Learning Engineering Practice. Leiden: CRC Press/Balkema.
  • University of Waterloo. 2022. Bloom's Taxonomy. Centre for Teaching Excellence, University of Waterloo. https://uwaterloo.ca/centre-for-teaching-excellence/teaching-resources/teaching-tips/planning-courses-and-assignments/blooms-taxonomy.
  • Valentine, A., S. Lowenhoff, M. Marinelli, S. Male, and G. M. Hassan. 2020. “Building Students’ Nascent Understanding of Ethics in Engineering Practice.” European Journal of Engineering Education 45 (6): 957–970. https://doi.org/10.1080/03043797.2020.1793913
  • Vasquez, E. S., M. J. Dewitt, Z. J. West, and M. J. Elsass. 2020. “Impact of Team Formation Approach on Teamwork Effectiveness and Performance in an Upper-Level Undergraduate Chemical Engineering Laboratory Course.” International Journal of Engineering Education 36: 491–501.
  • Vojinovic, O., V. Simic, I. Milentijevic, and V. Ciric. 2020. “Tiered Assignments in Lab Programming Sessions: Exploring Objective Effects on Students’ Motivation and Performance.” IEEE Transactions on Education 63 (3): 164–172. https://doi.org/10.1109/TE.2019.2961647
  • Wagener, D. 2006. “Promoting Independent Learning Skills Using Video on Digital Language Laboratories.” Computer Assisted Language Learning 19 (4-5): 279–286. https://doi.org/10.1080/09588220601043180
  • Wahab, N. A. A., N. A. Aqila, N. Isa, N. I. Husin, A. M. Zin, M. Mokhtar, and N. M. A. Mukhtar. 2021. “A Systematic Review on Hazard Identification, Risk Assessment and Risk Control in Academic Laboratory.” Journal of Advanced Research in Applied Sciences and Engineering Technology 24 (1): 47–62. https://doi.org/10.37934/araset.24.1.4762.
  • Wolf, T. 2010. “Assessing Student Learning in a Virtual Laboratory Environment.” IEEE Transactions on Education 53 (2): 216–222. https://doi.org/10.1109/TE.2008.2012114
  • Wright, K., P. E. Slaboch, and R. Jamshidi. 2022. “Technical Writing Improvements Through Engineering lab Courses.” International Journal of Mechanical Engineering Education 50 (1): 120–134. https://doi.org/10.1177/0306419020939621
  • Zimmerman, B. J., and A. R. Moylan. 2009. “Self-regulation: Where Metacognition and Motivation Intersect.” In Handbook of Metacognition in Education, edited by D. J. Hacker, J. Dunlosky, and A. C. Graesser, 311–328. New York: Routledge.