
Leveraging technology for animal anatomy practicals

Suresh Krishnasamy & Edward Narayan
Pages 141-159 | Received 30 Aug 2022, Accepted 18 Apr 2023, Published online: 01 Jun 2023

Abstract

To deliver a comprehensive learning experience while shifting to online teaching due to COVID-19, educators at The University of Queensland (Australia) adopted Lt, a cloud-based platform, to overcome the challenges of delivering animal anatomy practicals. A two-phased study was conducted to evaluate the use of Lt for both online and on-campus students and its impact on student satisfaction and performance. Phase 1 investigated students' satisfaction with the practical experience, with online students expressing greater satisfaction with the Lt and feedback constructs related to the practical. Phase 2 investigated end-of-course evaluations, with all evaluated items increasing from 2019 to 2020 and 2021, and with student performance showing no difference between the online and on-campus students for practical assessment items and final examinations. The findings give confidence that technology adoption can enhance the learning experience for online students and provide an exemplar for similar adoption in practical delivery across other science disciplines.

Introduction

The onset of COVID-19 led universities across Australia to shift rapidly to an online teaching environment (McGaughey et al., 2021). At The University of Queensland, improving COVID-19 numbers across the state resulted in a decision to resume on-campus teaching for the summer of 2020. However, with a large proportion of enrolled students living overseas or in other states under various restrictions, all courses were also offered in an online mode (Mahdy & Sayed, 2022).

Online anatomy educational resources are not uncommon, with much research done on their impact on student satisfaction (Johnson et al., 2013; Mathiowetz et al., 2016) and performance (Attardi et al., 2018; Attardi & Rogers, 2015; Green et al., 2018). Despite the extensive literature on the use of online anatomy resources, before COVID-19 these were focused on human anatomy (Sugand et al., 2010) and included computer-aided interactive software (Mathiowetz et al., 2016) and online laboratories delivered through learning management systems (Attardi et al., 2018; Attardi & Rogers, 2015). In the early days of the pandemic, however, online animal anatomy resources were shared across institutions in support of the online transition (Evans et al., 2020). These resources had previously been found to enhance on-campus delivery rather than replace it (Martin et al., 2022). As a result, the pandemic sparked much interest in the development and use of online veterinary anatomy resources as replacements for on-campus delivery (Kapoor & Singh, 2022; Mahdy & Sayed, 2022). However, these resources were focused on common veterinary animals such as dogs, cats, pigs, and horses (Martin et al., 2022), and would not serve the purposes of a general animal science course that covers all manner of animals, especially Australian wildlife. Furthermore, the development of such resources requires time and expertise that are not readily available, especially considering the effort required to transition the lectures and tutorials (Pather et al., 2020).

This challenge was preemptively identified by the late Dr. Tony Macknight, who founded ADInstruments in 1986 to create a simple, flexible tool that enabled educators to capture and analyze data (ADInstruments, 2021). The resulting software, Lt, was able to pair with on-campus laboratory equipment so that students could collect data in real time, work in groups, and be engaged before, during, and after laboratory sessions (Dutta, 2016). With the onset of the pandemic, ADInstruments offered the Lt platform to anatomy and physiology academics across the world. The platform was flexible in allowing educators to build lessons using their own resources or to package together lessons from a vast range of readily available interactive activities (Calderon et al., 2022; Dutta, 2016; Halpin, 2022), alleviating some of the challenges identified by Pather et al. (2020). Further, the platform also provided simulated data that could be used in lieu of actual data capture (Calderon et al., 2022). During the pandemic, educators teaching a range of anatomy and physiology courses across multiple levels used Lt to much success (Carrazoni et al., 2021; Duszenko et al., 2022; Lima et al., 2020; Smolle et al., 2021).

The implications for courses with significant laboratory components, such as the animal and plant biology courses offered at the School of Agriculture and Food Sciences, led to the pilot implementation of Lt in a first-year animal science course. Applied Animal Biology is an introductory animal science anatomy course delivered by the School of Agriculture and Food Sciences, traditionally in a lecture, tutorial, and practical format. The course is one of the largest at the school, catering to students from all internal programs, such as the Bachelor of Agricultural Science (Animal Science major) and Bachelor of Wildlife Science, as well as programs from its partner school, the School of Veterinary Sciences. Prior to COVID-19, students were offered the opportunity to enrol in an external mode, in which they engaged with the lectures and tutorials through recordings but were expected to come to campus for a residential school to complete all course practical sessions as a block. With COVID-19 and the rise of Zoom as the primary lecture and tutorial delivery mode for universities (Riedl, 2022), students were able to attend those sessions live rather than through recordings. However, the practical sessions could not make the same transition due to the highly hands-on, applied nature of anatomy practicals (Longhurst et al., 2020).

This study explored the impact of using Lt on student satisfaction and performance in an introductory animal science course.

Materials and methods

The study was carried out in two phases. Phase 1 involved the collection of student satisfaction data from students enrolled in both the online and on-campus modes to determine student satisfaction levels with the newly introduced online practical elements. Phase 2 involved the collation of student performance data for all individual assessment items and the university-wide end-of-semester course evaluations from the 2020 and 2021 cohorts to trace the impact the online practicals had on student performance and course ratings.

All statistical analysis of the quantitative data was performed using SPSS version 27.0. Descriptive statistics, specifically means and standard deviations, were determined for all quantitative data collected. In comparing data sets, independent-samples t tests were run at a 95% confidence level.
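The study's analysis was run in SPSS; as an illustrative, non-authoritative sketch, the same workflow (descriptive statistics followed by an independent-samples t test) might look as follows in Python, with placeholder arrays standing in for the study data.

```python
# A minimal Python/SciPy sketch of the analysis workflow described
# above (the study itself used SPSS 27.0). The score arrays below are
# hypothetical placeholders, not data from the study.
import numpy as np
from scipy import stats

online = np.array([4.2, 3.9, 4.5, 4.0, 4.3])     # placeholder construct scores
on_campus = np.array([4.4, 4.1, 4.6, 4.2, 4.5])  # placeholder construct scores

# Descriptive statistics: mean and (sample) standard deviation.
for label, scores in (("online", online), ("on-campus", on_campus)):
    print(f"{label}: M = {scores.mean():.2f}, SD = {scores.std(ddof=1):.2f}")

# Independent-samples t test; significance judged at the 95% level.
t_stat, p_value = stats.ttest_ind(online, on_campus, equal_var=True)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```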

A thematic analysis was conducted on all qualitative data collected to determine aspects connected to the practical sessions. The thematic analysis followed the six-step approach illustrated by Xu and Zammit (2020): (1) data familiarization, (2) initial code generation, (3) theme search, (4) thematic review, (5) theme definition, and (6) report production.

With over 12 years' experience in learning design and active involvement in technology-enhanced learning initiatives across various institutions of higher education in different countries, I performed the qualitative analysis. In conducting the analysis, I relied on this experience and an understanding of the student cohort at the university to devise the codes and determine the themes.

Practical design

With the quick transition to online delivery, and with lectures and tutorials transitioning smoothly to Zoom, much emphasis was placed on practical delivery and on providing students in the online mode with an experience comparable to that of the on-campus students. To meet that challenge, the teaching team decided to adopt Lt, a cloud-based platform used to deliver highly interactive and engaging lessons. Lessons designed through the platform were used across both enrollment modes, engaging students before, during, and after the practical session. Figure 1 provides an overview of the practical lesson design.

Figure 1. Overview of practical lesson design for online and on-campus delivery.

The key difference between the activities in the two modes occurs during the session. The on-campus students have hands-on activities, engaging with the practical material to help them answer related questions on the platform. For the online students, however, the Lt platform provides detailed videos and interactive simulations that walk them through the entire practical process, providing the information they would have obtained on campus. Being an anatomy course, this information focused on tactile elements, including step-wise descriptions of anatomical markings, incisions for inspection of sub-dermal and internal anatomical structures, and how muscles are arranged in various species of animals. This was further supplemented by video recordings and images captured by the teaching team. The platform allowed the teaching team to design a variety of engaging activities, such as drag-and-drop exercises, open-ended questions, and diagram labeling. The platform also made it seamless for the teaching team to provide students with feedback on their answers and activities. Screenshots from sample lessons are shown in Figure 2.

Figure 2. Screenshot from sample lessons.

Participants

This study was carried out with students enrolled in a first-year anatomy course in 2020 and 2021. Phase 1 of the study involved the 2020 cohort, while Phase 2 involved the 2021 cohort. The enrollment numbers for each year, with 2019 enrollment as a comparison, along with the distribution across the two modes of study, are summarized in Table 1.

Table 1. Summary of enrollment.

The distribution of students across programs was consistent, with about 50% of the students in the Bachelor of Veterinary Technology program, followed by about 20% in the Bachelor of Wildlife Science program. The two other programs with appreciable numbers were the Bachelor of Agricultural Science and the Bachelor of Equine Science. The rest of the cohort were distributed across various programs, as illustrated in Figure 3.

Figure 3. Distribution of students' programs.

All 293 students in the 2020 cohort were invited to participate anonymously in the satisfaction survey.

Ethics approval for the project was granted by The University of Queensland Institutional Human Research Ethics Committee (approvals 201900166, 2021/HE000888, and 2022/HE000113).

Satisfaction survey

The survey was developed in consultation with the course coordinator, targeting student satisfaction levels split into four sections, as shown in Table 2: (1) design, which included statements relating to the role of the sessions in increasing understanding of concepts and the associated support materials; (2) Lt, which included statements related to engagement with the activities and recommendations for its use in other courses; (3) feedback, which included statements addressing satisfaction with the feedback received; and (4) for online students, an additional section focusing on elements made available just for them (videos and images) and a perception question on whether they felt they had missed out by having their practicals online. Each section had between six and eight statements, and students rated their satisfaction by agreeing with each statement on a 5-point Likert scale ranging from strongly agree to strongly disagree. A decision was made to make the satisfaction survey a purely quantitative instrument, as the end-of-semester course evaluations contained qualitative questions seeking similar information related to student satisfaction.

Table 2. Constructs and associated questions in the satisfaction survey.

End-of-semester course evaluation

Across universities, it is common for students to be involved in the evaluation of teaching quality through their responses to an institution-wide questionnaire, often seeking their perception of course design, learning materials, and assessment practices, among others (Anderson et al., 2005). The Student Evaluation of Course and Teacher (SECaT) is conducted at the end of each semester, when students can provide feedback on their experience of courses and teaching at the university. Each time a course is offered, students enrolled in that course are invited to evaluate the course and teacher(s) via an online SECaT evaluation. These evaluations are conducted through the central evaluations department at the university, and individual teaching teams are provided with the aggregated data once the semester results are released to the students. Students are invited to take part by email from Week 12 (the second-to-last week of the semester), and the surveys remain open for 3 weeks. Students are sent up to three reminders, usually 5 days apart.

The SECaT questions were mapped to various aspects of the course, comprising both quantitative and qualitative elements. The SECaT consists of eight statements for which students rated their satisfaction by agreeing with each statement on a 5-point Likert scale ranging from strongly disagree to strongly agree. Students were also given an opportunity to express their thoughts, in open-ended questions, on the best aspects and areas for improvement (see Table 3). With the practical aspect of the course changing from semester to semester, the SECaT gives the teaching team an overarching view of students' satisfaction and rating of the course as related to the practicals.

Table 3. Mapping of SECaT questions to course elements and types of data.

Student performance

A variety of assessments were implemented to assess student performance across the semester. This was greatly influenced by the differing teaching and learning environments that the 2020 and 2021 cohorts found themselves in. With the 2020 cohort having to transition online rapidly, many of the assessment items focused on engaging and supporting their online experience. In the 2021 cohort, with more students returning to campus, the assessment shifted toward a more traditional testing of knowledge and application of skills. However, the practical assessment was kept at a weighting of 12% across both semesters. The scores for the practical assessment and the final assessment were compared, owing to the links between the final assessment and the practicals. Additionally, practical submission rates were captured and compared across cohorts. The assessment distribution and weighting across the years are summarized in Table 4.

Table 4. Summary of assessment.

Results

Satisfaction survey

The satisfaction survey was conducted with the 2020 cohort. A total of 166 students (56.7%) responded to the survey, with a response rate of 75.2% for the on-campus enrollment but only 22.3% for the online enrollment. Individual item scores were summed to obtain construct scores prior to statistical analysis. The practical structure and design construct had the highest mean score of 4.35 out of 5.0, while the use of Lt had a comparatively poor score of 2.98. The online cohort rated the Lt and feedback constructs higher than the on-campus cohort but rated the practical design construct comparatively lower. An independent t test was carried out to compare construct mean scores for the online cohort versus the on-campus cohort. The construct mean scores and Cronbach alpha values are summarized in Table 5.

Table 5. Overall course means for constructs and Cronbach α values.
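To make the construct scoring and reliability reporting concrete, here is a minimal sketch (using invented placeholder ratings, not the study's data) of summing item scores into a construct score and computing Cronbach's alpha:

```python
# Sketch of construct scoring and Cronbach's alpha. `items` is a
# hypothetical respondents-by-items matrix of 5-point Likert ratings;
# the real survey constructs had three to six items each.
import numpy as np

items = np.array([
    [5, 4, 5],  # respondent 1 (placeholder ratings for a 3-item construct)
    [4, 4, 3],  # respondent 2
    [2, 3, 2],  # respondent 3
    [5, 5, 4],  # respondent 4
])

# Construct score: sum of the item scores for each respondent.
construct_scores = items.sum(axis=1)

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance).
k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)
total_variance = construct_scores.var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```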

The online experience construct related to the images and videos provided by the teaching team; only the online cohort responded to it, with a mean score of 4.03.

Practical design

The practical design construct had six items, with a course overall mean score of 4.35 out of 5.0. Our analysis showed no significant difference between the ratings of the 23 online students (M = 4.22, SD = 0.42) and the 143 on-campus students (M = 4.37, SD = 0.49), t(164) = -1.350, p = 0.089.

Lt

The Lt construct had three items with a course overall mean score of 2.98 out of 5.0. Our analysis showed that the 23 online students (M = 3.81, SD = 1.06) rated Lt significantly better compared to the 143 on-campus students (M = 2.85, SD = 0.97), t(164) = 4.36, p < 0.001.

Feedback

The feedback construct had three items with a course overall mean score of 3.71 out of 5.0. Our analysis showed that the 23 online students (M = 4.05, SD = 0.67) rated feedback received significantly better compared to the 143 on-campus students (M = 3.66, SD = 0.76), t(164) = 2.38, p = 0.009.
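The reported group sizes, means, and standard deviations are sufficient to reproduce these tests. As an illustrative check (not part of the original analysis), the sketch below recomputes the Lt comparison; the integer degrees of freedom, t(164), indicate a pooled-variance test, and the reported p values correspond to one-tailed tests (for example, t = 2.38 at 164 degrees of freedom gives a two-sided p of about 0.018, half of which matches the reported 0.009).

```python
# Reproducing the Lt construct comparison from the reported summary
# statistics alone (pooled-variance test, consistent with df = 164).
from scipy import stats

res = stats.ttest_ind_from_stats(
    mean1=3.81, std1=1.06, nobs1=23,    # online cohort
    mean2=2.85, std2=0.97, nobs2=143,   # on-campus cohort
    equal_var=True,
)
print(f"t = {res.statistic:.2f}")  # ~4.35; matches t(164) = 4.36 to rounding
print(f"one-tailed p = {res.pvalue / 2:.2g}")  # SciPy reports a two-sided p
```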

End-of-semester course evaluation

The SECaTs are aggregated by the university evaluation team, and statistical data were provided to the teaching team along with the qualitative data collected. The response rates for each enrollment mode across the cohorts are provided in Table 6.

Table 6. SECaT response rates for each enrollment mode across the cohorts.

The SECaT scores for both the 2020 and 2021 cohorts were very positive, with most statements having mean scores of more than 4.0 out of a maximum of 5. This was a large improvement over the 2019 scores, summarized in Table 7, which had lower means averaging 3.50. The largest increase in mean scores was noted for Statement 6, relating to assessments, which moved from a mean score of 2.71 (2019 online cohort) to 4.03 (2020 online cohort) and 4.00 (2021 online cohort). The scores for the 2021 online cohort had comparatively lower means than the 2020 online cohort, except for Statement 2, which related to how intellectually stimulating the course was and which saw much improvement from a mean score of 3.43 (2019 online cohort) to 4.36 (2020 online cohort) and 4.43 (2021 online cohort). One statement directly attributable to the practicals was Statement 4, as the materials used in 2019, 2020, and 2021 were similar except for the incorporation of Lt from 2020. Statement 4 mean scores for the online cohort increased from 4.14 (2019) to 4.28 (2020) to 4.3 (2021). A summary of the SECaT scores is provided in Figures 4 and 5.

Figure 4. Summary of SECaT scores for course design (S1 and S3) and assessment and feedback (S5 and S6) elements for 2020 and 2021 cohort.

Figure 5. Summary of SECaT scores for perception of learning (S2 and S7), learning materials (S4) and overall course rating (S8) for 2020 and 2021 cohort.

Table 7. 2019 SECaT scores.

An interesting observation was that for the 2020 cohort, the on-campus students had higher mean scores across all statements than the online cohort. This was, however, reversed for the 2021 cohort, where the online cohort had higher mean scores across all statements. The overall course rating for the online cohort also improved from 3.43 (2019) to 4.39 (2020) and 4.18 (2021).

The qualitative questions asked students about the best aspects of the course as well as suggestions for improvement. The thematic analysis performed on the responses identified eight themes across both the best aspects and the suggestions for improvement, related to various aspects of the course (course staff, course structure, course content, learning material, practical, and assessment and feedback), which are summarized in Tables 8 and 9. The practical theme was of particular interest and is the focus of this Results section, as we concluded that the other themes had no implication for the incorporation of Lt or the redesigned practical. There were 29 mentions of "practical" in the 2020 cohort responses and 87 mentions in the 2021 cohort responses. As Table 8 illustrates, the practical was the most mentioned best aspect of the course for both on-campus cohorts. The 2020 online cohort had more mentions of practicals than the 2021 online cohort. The responses for suggested improvements saw comparatively fewer mentions of the practicals, though this theme had the highest occurrence for the 2021 on-campus students (Table 9). This was reasoned as potentially due to student expectations of on-campus practicals, which was corroborated by the subthemes identified in Tables 10 and 11.
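The mention counts above amount to a keyword-frequency pass over the open-ended responses during the theme-search step. A hypothetical sketch of such a count (the response strings are invented placeholders, not actual SECaT data):

```python
# Hypothetical sketch of counting keyword mentions during theme search.
import re

responses = [  # invented placeholder responses
    "The practicals were the best part of the course.",
    "More hands-on practical time would be great.",
    "Loved the wet specimens and the tutors.",
]

# Match "practical", "practicals", etc., case-insensitively.
pattern = re.compile(r"\bpractical\w*", re.IGNORECASE)
mentions = sum(len(pattern.findall(response)) for response in responses)
print(f"'practical' mentions: {mentions}")  # 2 for these placeholders
```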

Table 8. Themes identified in thematic analysis from SECaT responses for best aspects of the course.

Table 9. Themes identified in thematic analysis from SECaT responses for suggested improvements.

Table 10. Subthemes from practical theme identified in thematic analysis from SECaT responses for best aspects of the course.

Table 11. Subthemes from practical theme identified in thematic analysis from SECaT responses for suggested improvements.

In identifying the subthemes, responses that mentioned the practicals but provided no further information were omitted; only responses that provided elaboration were categorized. The subthemes identified from the on-campus students in both the 2020 and 2021 cohorts were the practical structure, the hands-on and face-to-face experiences, the use of wet specimens, and that the sessions were interesting, the last noted in the 2021 cohort with 19 responses. For the online cohorts, respondents likewise highlighted the structure of the practicals along with the engagement and learning assistance they provided.

In terms of suggested improvements, the subtheme identified in the responses from both online cohorts was the amount of work involved in the practicals relative to a perceived low percentage weighting. The 2021 on-campus students highlighted the need for more hands-on activities, which was interesting given that the hands-on experience was also deemed the best aspect by their peers. The 2020 on-campus cohort had only two responses suitable for categorization, and both alluded to better structuring of the sessions.

Overall, the lack of specific mentions of the Lt platform indicated to the project team that students saw the platform as integrated with the practical activities. Tables 10 and 11 summarize the subthemes identified from the qualitative responses in the SECaT.

Student performance

The two assessment items compared across the cohorts were the practical assessment and the final examination. The final examination for the 2020 cohort was scaled to a 30% weighting prior to comparison. For each assessment item, an independent t test was carried out to analyze the following comparisons (a minimal sketch of this procedure follows the list):

  • 2020 online cohort versus 2021 online cohort

  • 2020 online cohort versus 2020 on-campus cohort

  • 2021 online cohort versus 2021 on-campus cohort.
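A minimal sketch of this procedure, using invented placeholder scores and an assumed raw maximum for the 2020 examination (the actual maximum is not reported here):

```python
# Sketch of the planned comparisons. All score arrays are hypothetical
# placeholders; RAW_MAX_2020 is an assumed value, as the original
# maximum of the 2020 examination is not reported.
import numpy as np
from scipy import stats

RAW_MAX_2020 = 50.0
final_2020_online_raw = np.array([44.0, 47.5, 40.0])  # placeholder raw scores
final_2021_online = np.array([16.0, 18.5, 21.0])      # already out of 30

# Scale the 2020 scores to a 30% weighting before comparison.
final_2020_online = final_2020_online_raw / RAW_MAX_2020 * 30.0

comparisons = {
    "2020 online vs 2021 online": (final_2020_online, final_2021_online),
    # The two online-vs-on-campus comparisons follow the same pattern.
}
for label, (group_a, group_b) in comparisons.items():
    t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=True)
    print(f"{label}: t = {t_stat:.2f}, p = {p_value:.3f}")
```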

Practical assessment

The students' practical assessment scores (out of 12 marks) for the 2020 and 2021 cohorts are summarized in Figure 6. For each cohort, the on-campus students received a higher mean score than the online students. However, comparing across cohorts, each enrollment mode performed similarly. A larger spread of scores, illustrated by the standard deviation, was also noted for the online students compared with those on campus.

Figure 6. Summary of practical assessment scores for the 2020 and 2021 cohorts.

Our analysis showed that for the 2021 cohort, the 237 on-campus students (M = 10.86, SD = 2.09) scored significantly better in the practical assessment than the 78 online students (M = 10.36, SD = 2.61), t(313) = 1.71, p = 0.044. However, there was no significant difference between the performance of the 190 on-campus students in the 2020 cohort (M = 10.71, SD = 2.27) and the 103 online students (M = 10.33, SD = 2.48) in that cohort, t(291) = 1.31, p = 0.096. There was also no significant difference between the performance of the 103 online students in the 2020 cohort (M = 10.33, SD = 2.48) and the 78 online students in the 2021 cohort (M = 10.36, SD = 2.61), t(179) = −0.08, p = 0.468.

The practical submission rates across both cohorts were high, averaging above 80%. Despite the 2020 cohort having to submit more assessment items (12 practical reports compared with six for the 2021 cohort), the submission rates were relatively consistent, as summarized in Table 12.

Final examination

The students' final examination scores (out of 30 marks) for the 2020 and 2021 cohorts are summarized in Figure 7. The data show that the 2020 cohort had a higher mean score than the 2021 cohort. However, within each cohort, the on-campus and online students performed similarly. A larger spread of scores, illustrated by the standard deviation, was also noted for the 2021 cohort compared with the 2020 cohort.

Figure 7. Summary of final examination scores for the 2020 and 2021 cohorts.

Our analysis showed that the 103 online students in the 2020 cohort (M = 28.10, SD = 3.28) scored significantly better in the final examination compared to the 78 online students in the 2021 cohort (M = 16.58, SD = 5.94), t(112.16) = 15.42, p < 0.001. However, there was no significant difference between the performance of the 190 on-campus students (M = 28.32, SD = 1.83) compared to the 103 online students (M = 28.10, SD = 3.28) in the 2020 cohort, t(291) = 0.75, p = 0.227. There was also no significant difference between the performance of the 237 on-campus students (M = 16.26, SD = 4.56) compared to the 78 online students (M = 16.58, SD = 5.94) in the 2021 cohort, t(313) = -0.50, p = 0.308.
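Notably, the fractional degrees of freedom in the first comparison, t(112.16), are consistent with Welch's unequal-variance correction, and the result can be checked from the reported summary statistics alone:

```python
# Checking the online 2020-vs-2021 final-examination comparison from the
# reported summary statistics. equal_var=False applies Welch's correction,
# consistent with the fractional df (112.16) reported above.
from scipy import stats

res = stats.ttest_ind_from_stats(
    mean1=28.10, std1=3.28, nobs1=103,  # 2020 online cohort
    mean2=16.58, std2=5.94, nobs2=78,   # 2021 online cohort
    equal_var=False,
)
print(f"t = {res.statistic:.2f}")  # ~15.44; reported as t(112.16) = 15.42
```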

Discussion

The rapid transition to online delivery was critical to the continued progress and success of higher education as an industry (Duszenko et al., 2022; Lima et al., 2020). The rapid reskilling of educators and the resilience of students in adapting to a new learning style have been positive outcomes of the global COVID-19 pandemic (Carnegie et al., 2021; Ewing & Cooper, 2021; Flynn et al., 2021). Arising from the challenges of delivering comprehensive, engaging anatomy education, educators have not only developed a wealth of resources (Calderon et al., 2022; Mahdy & Sayed, 2022; Martin et al., 2022) but have also leveraged existing platforms such as Lt to better engage students (Carrazoni et al., 2021; Halpin, 2022; Smolle et al., 2021). This study illustrates the implementation of Lt in a hybrid model, supporting the delivery of practicals for students enrolled in both online and on-campus modes. Specifically, the study aimed to understand the impact on the online students and their experiences compared with those on campus. The findings give much confidence in delivering a traditionally hands-on practical course through online means. The high satisfaction levels show that students were satisfied with the use of Lt for the practical sessions, and the 2020 student performance in practicals was not significantly affected by the lack of hands-on laboratory sessions, owing to the use of Lt. Together, these results can show the way for similar initiatives in other practical-based courses.

Students were very satisfied with the use of Lt for practical delivery, with satisfaction levels for the platform and the feedback received being significantly greater for the online cohort than for their on-campus peers. This was not unexpected, as the on-campus students had various opportunities to engage with the teaching team, getting feedback and engaging in the activities, so the Lt platform would not have been a difference-maker in their case. For the online students, however, the platform was their only engagement with the practical content and activities, including submissions and feedback. These findings are consistent with student satisfaction levels in other Lt implementations in various settings (Carrazoni et al., 2021; Halpin, 2022; Smolle et al., 2021).

The end-of-semester SECaT scores corroborate the findings from the satisfaction survey, with higher satisfaction ratings for both the 2020 and 2021 cohorts, when Lt was implemented, than for 2019, when it was not. As with any educational technology adoption, both students and educators need time to embrace it and find how it best fits within their learning schema (Chew et al., 2018). This was evident in the SECaT scores, where the 2021 online cohort rated each statement higher than their on-campus peers, the opposite of the 2020 ratings. This gave the teaching team confidence in the success of Lt's implementation, more so as the lectures and tutorials were kept the same for both cohorts. The lack of direct reference to the practicals in the suggested course improvements adds to that confidence, especially given that the structure of the practicals, along with Lt's ability to engage students and support their learning, was identified among the best aspects of the course.

One limitation of the study is the comparatively small sample of online students who participated in the satisfaction survey and SECaT evaluations. Though small, the response rates are comparable to those reported in the literature for course evaluations (Chapman & Joines, 2017; Lowenthal et al., 2015) and email surveys (Dommeyer et al., 2004). This has prompted much research into strategies to increase response rates, with varying degrees of success (Crews & Curtis, 2011). However, Avery et al. (2006) highlighted that poor response rates did not affect mean scores. Similarly, as the response rates across the online cohorts are similar, the ratings are taken as representative of their cohorts' impressions.

Exploring the impact of Lt on student performance, we note that the 2021 on-campus students performed significantly better than their online peers in the practical assessments, though this difference was not observed in the final examination. The teaching team hypothesizes that the differing experiences during the practical sessions might have led to this. During the on-campus practical sessions, the students worked in groups, had allotted time for discussions, and were often able to complete the submissions within the session itself. The course tutors were also readily available to address doubts and queries. Conversely, the online cohort attempted the required activities on their own, and if they required clarification, they had to seek it through the course discussion board and emails, which required comparatively more effort on their part. Additionally, the tactile experience that the on-campus students had might have helped them learn the practical material more effectively. Unfortunately, this hypothesis could not be confirmed through any of the data collected, which is a limitation of the study. Retrospectively, a study design that involved a post-course focus group or individual reflective exercises would allow a better understanding of the student experience in engaging with the activities and its impact on performance. Likewise, though the 2020 online cohort performed significantly better than the 2021 online cohort in the final examination, this was not evident in their practical assessment comparison. This was expected, as the difficulty level of the 2020 examination was lowered and it was conducted as an open-book examination to help mitigate any COVID-19 related challenges students might have faced. Though the practical assessment changed from 12 submissions in 2020 to six in 2021, the submission rates, which give an indication of student engagement in the activities, were consistently high, as summarized in Table 12. The performance of the online cohorts shows that the lack of physical data collection and laboratory sessions did not significantly affect knowledge gain. However, it must be acknowledged that the questions posed to students as part of the practical were based on the interactive activities they engaged in. Thus, the practical submissions might not be a true reflection of what students learned through the semester (Martin et al., 2022). In future studies, consideration should be given to a pre- and post-study design to better measure student knowledge gain through the activity.

Table 12. Average submission rate for practical assessments.

Conclusion and future work

This pilot use of Lt at the School of Agriculture and Food Sciences has sparked interest in the tool at the university, and it has since found applications in animal physiology, medical endocrinology, and veterinary science courses. It has also been touted for use in equine science, which has a dearth of online educational resources. The implementation model illustrated in this study has been adopted as a reference for these various implementations, to varying degrees of success. The adoption across disciplines should be further investigated to determine whether the nature of the subject matter influences the ability of Lt to bridge the online and on-campus delivery modes. With the online delivery mode set to stay for the foreseeable future, the use of Lt to assist student learning is set to continue and grow in adoption. However, further work examining the influence of Lt on student performance outcomes, potentially through control groups, could be undertaken.

Acknowledgments

The project team would like to acknowledge the collective efforts of ADInstruments and the technical services and administrative teams at The University of Queensland School of Agriculture and Food Sciences. Additionally, a special thanks goes to the tutors, especially lead tutors Frank Burley, Eden Hurn, and Jade Copeland, whose dedication to student learning helped make this project a success.

Disclosure statement

No potential conflict of interest was declared by the authors.

Data availability statement

The data that support the findings of this study are available from the corresponding author, Suresh Krishnasamy, upon reasonable request.

Additional information

Notes on contributors

Suresh Krishnasamy

Suresh Krishnasamy is a teaching-focused academic at UQ, where he supports the teaching and learning needs of colleagues and students. His educational research career focuses on the adoption of educational technology, academic professional development, and the transition of first-year students from high school into undergraduate education.

Edward Narayan

Edward Narayan is a senior lecturer of Animal Science at UQ whose research is in the thematic areas of comparative vertebrate physiology, stress endocrinology, reproductive endocrinology, animal health and welfare, and conservation biology. He represents the university as a LINK member for the Universities Federation of Animal Welfare.

References

  • ADInstruments. (2021). Our story. https://www.adinstruments.com/company/our-story
  • Anderson, H. M., Cain, J., & Bird, E. (2005). Online student course evaluations: Review of literature and a pilot study. American Journal of Pharmaceutical Education, 69(1), 34–43. https://doi.org/10.5688/aj690105
  • Attardi, S. M., Barbeau, M. L., & Rogers, K. A. (2018). Improving online interactions: Lessons from an online anatomy course with a laboratory for undergraduate students. Anatomical Sciences Education, 11(6), 592–604. https://doi.org/10.1002/ase.1776
  • Attardi, S. M., & Rogers, K. A. (2015). Design and implementation of an online systemic human anatomy course with laboratory. Anatomical Sciences Education, 8(1), 53–62. https://doi.org/10.1002/ase.1465
  • Avery, R. J., Bryant, W. K., Mathios, A., Kang, H., & Bell, D. (2006). Electronic course evaluations: Does an online delivery system influence student evaluations? The Journal of Economic Education, 37(1), 21–37. https://doi.org/10.3200/JECE.37.1.21-37
  • Calderon, B., Steel, C., Ford, B., Sue, J., & Bracewell, K. (2022). Lt: A resource to future-proof the laboratory in uncertain times. The Journal of Undergraduate Neuroscience Education, 20(2), A267–A277. https://www.funjournal.org/wp-content/uploads/2022/12/june-20-267.pdf
  • Carnegie, G. D., Guthrie, J., & Martin-Sardesai, A. (2021). Public universities and impacts of COVID-19 in Australia: Risk disclosures and organisational change. Accounting, Auditing & Accountability Journal, 35(1), 61–73. https://doi.org/10.1108/AAAJ-09-2020-4906
  • Carrazoni, G. S., Lima, K. R., Alves, N., & Mello-Carpes, P. B. (2021). Report on the online course "basic concepts in neurophysiology": A course promoted during the COVID-19 pandemic quarantine. Advances in Physiology Education, 45(3), 594–598. https://doi.org/10.1152/advan.00239.2020
  • Chapman, D. D., & Joines, J. A. (2017). Strategies for increasing response rates for online end-of-course evaluations. International Journal of Teaching and Learning in Higher Education, 29(1), 47–60. https://www.isetl.org/ijtlhe/pdf/IJTLHE2392.pdf
  • Chew, S. W., Cheng, I. L., Kinshuk, & Chen, N.-S. (2018). Exploring challenges faced by different stakeholders while implementing educational technology in classrooms through expert interviews. Journal of Computers in Education, 5(2), 175–197. https://doi.org/10.1007/s40692-018-0102-4
  • Crews, T. B., & Curtis, D. F. (2011). Online course evaluations: Faculty perspective and strategies for improved response rates. Assessment & Evaluation in Higher Education, 36(7), 865–878. https://doi.org/10.1080/02602938.2010.493970
  • Dommeyer, C. J., Baum, P., Hanna, R. W., & Chapman, K. S. (2004). Gathering faculty teaching evaluations by in‐class and online surveys: Their effects on response rates and evaluations. Assessment & Evaluation in Higher Education, 29(5), 611–623. https://doi.org/10.1080/02602930410001689171
  • Duszenko, M., Frohlich, N., Kaupp, A., & Garaschuk, O. (2022). All-digital training course in neurophysiology: Lessons learned from the COVID-19 pandemic. BMC Medical Education, 22(1), 3. https://doi.org/10.1186/s12909-021-03062-3
  • Dutta, K. K. (2016). Integration of digital/blended pedagogy and a data acquisition system to enhance anatomy and physiology laboratory teaching for allied health students: A learner-centric strategy. The FASEB Journal, 30(1), 776.29–776.29. https://faseb.onlinelibrary.wiley.com/doi/abs/10.1096/fasebj.30.1_supplement.776.29
  • Evans, D. J. R., Bay, B. H., Wilson, T. D., Smith, C. F., Lachman, N., & Pawlina, W. (2020). Going virtual to support anatomy education: A stopgap in the midst of the COVID-19 pandemic. Anatomical Sciences Education, 13(3), 279–283. https://doi.org/10.1002/ase.1963
  • Ewing, L.-A., & Cooper, H. B. (2021). Technology-enabled remote learning during covid-19: Perspectives of Australian teachers, students and parents. Technology, Pedagogy and Education, 30(1), 41–57. https://doi.org/10.1080/1475939X.2020.1868562
  • Flynn, W., Kumar, N., Donovan, R., Jones, M., & Vickerton, P. (2021). Delivering online alternatives to the anatomy laboratory: Early experience during the COVID-19 pandemic. Clinical Anatomy, 34(5), 757–765. https://doi.org/10.1002/ca.23722
  • Green, R. A., Whitburn, L. Y., Zacharias, A., Byrne, G., & Hughes, D. L. (2018). The relationship between student engagement with online content and achievement in a blended learning anatomy course. Anatomical Sciences Education, 11(5), 471–477. https://doi.org/10.1002/ase.1761
  • Halpin, P. A. (2022). Redesigning a face-to-face course to an asynchronous online format: A look at teaching pathophysiology with software that enhances student engagement. Advances in Physiology Education, 46(2), 339–344. https://doi.org/10.1152/advan.00031.2022
  • Johnson, I. P., Palmer, E., Burton, J., & Brockhouse, M. (2013). Online learning resources in anatomy: What do students think? Clinical Anatomy, 26(5), 556–563. https://doi.org/10.1002/ca.22219
  • Kapoor, K., & Singh, A. (2022). Veterinary anatomy teaching from real to virtual reality: An unprecedented shift during COVID-19 in socially distant era. Anatomia, Histologia, Embryologia, 51(2), 163–169. https://doi.org/10.1111/ahe.12783
  • Lima, K. R., das Neves, B. S., Ramires, C. C., Dos Santos Soares, M., Martini, V. A., Lopes, L. F., & Mello-Carpes, P. B. (2020). Student assessment of online tools to foster engagement during the COVID-19 quarantine. Advances in Physiology Education, 44(4), 679–683. https://doi.org/10.1152/advan.00131.2020
  • Longhurst, G. J., Stone, D. M., Dulohery, K., Scully, D., Campbell, T., & Smith, C. F. (2020). Strength, weakness, opportunity, threat (SWOT) analysis of the adaptations to anatomical education in the United Kingdom and Republic of Ireland in response to the COVID-19 pandemic. Anatomical Sciences Education, 13(3), 301–311. https://doi.org/10.1002/ase.1967
  • Lowenthal, P., Bauer, C., & Chen, K.-Z. (2015). Student perceptions of online learning: an analysis of online course evaluations. American Journal of Distance Education, 29(2), 85–97. https://doi.org/10.1080/08923647.2015.1023621
  • Mahdy, M. A. A., & Sayed, R. K. A. (2022). Evaluation of the online learning of veterinary anatomy education during the COVID-19 pandemic lockdown in Egypt: Students' perceptions. Anatomical Sciences Education, 15(1), 67–82. https://doi.org/10.1002/ase.2149
  • Martin, J. F., Arnold, O. R., Linton, A., Jones, J. D., Garrett, A. C., Mango, D. W., Juarez, K. A., Gloeckner, G., & Magee, C. (2022). How virtual animal anatomy facilitated a successful transition to online instruction and supported student learning during the coronavirus pandemic. Anatomia, Histologia, Embryologia, 51(1), 36–49. https://doi.org/10.1111/ahe.12799
  • Mathiowetz, V., Yu, C. H., & Quake-Rapp, C. (2016). Comparison of a gross anatomy laboratory to online anatomy software for teaching anatomy. Anatomical Sciences Education, 9(1), 52–59. https://doi.org/10.1002/ase.1528
  • McGaughey, F., Watermeyer, R., Shankar, K., Suri, V. R., Knight, C., Crick, T., Hardman, J., Phelan, D., & Chung, R. (2021). ‘This can’t be the new norm’: Academics’ perspectives on the COVID-19 crisis for the Australian university sector. Higher Education Research & Development, 41(7), 1–16. https://doi.org/10.1080/07294360.2021.1973384
  • Pather, N., Blyth, P., Chapman, J. A., Dayal, M. R., Flack, N., Fogg, Q. A., Green, R. A., Hulme, A. K., Johnson, I. P., Meyer, A. J., Morley, J. W., Shortland, P. J., Strkalj, G., Strkalj, M., Valter, K., Webb, A. L., Woodley, S. J., & Lazarus, M. D. (2020). Forced disruption of anatomy education in Australia and New Zealand: An acute response to the COVID-19 pandemic. Anatomical Sciences Education, 13(3), 284–300. https://doi.org/10.1002/ase.1968
  • Riedl, R. (2022). On the stress potential of videoconferencing: Definition and root causes of Zoom fatigue. Electronic Markets, 32(1), 153–177. https://doi.org/10.1007/s12525-021-00501-3
  • Smolle, J., Rössler, A., Rehatschek, H., Hye, F., & Vogl, S. (2021). Lecture recording, microlearning, video conferences and Lt-platform–medical education during COVID-19 crisis at the Medical University of Graz. GMS Journal for Medical Education, 38(1). https://doi.org/10.3205/zma001407
  • Sugand, K., Abrahams, P., & Khurana, A. (2010). The anatomy of anatomy: A review for its modernization. Anatomical Sciences Education, 3(2), 83–93. https://doi.org/10.1002/ase.139
  • Xu, W., & Zammit, K. (2020). Applying thematic analysis to education: A hybrid approach to interpreting data in practitioner research. International Journal of Qualitative Methods, 19, 1–9. https://doi.org/10.1177/1609406920918810