
Comparison of Knowledge Gained in a Face-to-Face Versus an Online College-Level Nutrition Course

Pages 87-94 | Received 29 Nov 2021, Accepted 31 Mar 2022, Published online: 30 Jan 2024

Abstract

Although evidence exists that online education can produce outcomes comparable to the equivalent face-to-face (F2F) version of a course, there is still a dearth of comparative literature. The objective of this pilot was to investigate differences in academic performance between students participating in the F2F versus online versions of Nutrition 10, an introductory, college-level nutrition course. Students enrolled in Nutrition 10 (F2F; n = 907) and Nutrition 10V (virtual; n = 1,239) completed a 27-item nutrition knowledge questionnaire, developed from the course learning objectives (α = 0.92), before (pre-) and after (post-) the class. Students who took the class, regardless of delivery method, improved their nutrition knowledge (+6.9 points; p < 0.01). Students in the F2F class exhibited a greater improvement in nutrition knowledge (+7.7 points; p < 0.01) than those in the virtual class (+6.3 points; p < 0.01). There were significant differences in grades based on the quarter when the course was offered. Fall 2019 F2F students received a final course grade that was 3% higher than that of virtual students (p < 0.05), whereas winter 2020 virtual students received a grade that was 1.2% higher than that of F2F students (p < 0.01). Although F2F and online education share many similarities, there are still significant differences between the two modalities.

Online learning is a growing sector in higher education and is reported to be one of the most prominent trends in the educational use of technology (Fischer et al., Citation2020; Paul & Jefferson, Citation2019; Wei & Chou, Citation2020). A study by Allen and Seaman (Citation2013) found that about one in three students took at least one online course during their academic career. Similarly, the U.S. Department of Education estimated that one third of college students took at least one online course prior to the COVID-19 pandemic (De Brey et al., Citation2021). As online enrollments continue to increase, the number of students exclusively taking face-to-face (F2F) courses has been declining, with public and not-for-profit institutions attracting the greatest numbers of online students (Palvia et al., Citation2018). However, this trend is not necessarily due to proximity to campus, as more than half of students taking an online class also took a course on campus at the same time (Palvia et al., Citation2018).

Due to technological advancements, online education can be accessed almost anywhere in the world and attracts a more diverse pool of students, such as nontraditional students with full-time jobs (Palvia et al., Citation2018). Research shows that students appreciate the flexibility online education provides and enjoy self-paced learning (Paechter & Maier, Citation2010; Palvia et al., Citation2018). Students also report finding the traditional F2F modality restrictive and inflexible (Paul & Jefferson, Citation2019). With advancing technologies, online education has become increasingly viable technologically, economically, and operationally (Palvia et al., Citation2018; Soffer & Nachmias, Citation2018). Research also suggests that the online format could potentially lower costs of delivery and increase access (Goodman et al., Citation2019; Palvia et al., Citation2018). Historically, F2F education tends to be teacher-centered, with students passively taking in the information, whereas online education is more student-centered, with students actively engaging with the content (Paul & Jefferson, Citation2019). Students who may be hesitant to speak in class discussions or ask questions in a F2F environment may feel more comfortable participating in the online format (Paul & Jefferson, Citation2019). However, there are many challenges associated with online learning, including isolation, attention span, course navigation, self-monitoring, technical difficulties, and academic integrity (Paul & Jefferson, Citation2019). There have been reports of high dropout rates and achievement gaps in online courses, so it is important to identify factors that contribute to these problems (Kebritchi et al., Citation2017; Paul & Jefferson, Citation2019). Effective communication can be difficult to implement in an online medium because of the lack of physical meetings (Soffer & Nachmias, Citation2018). The transition to online education removes an instructor's ability to communicate with students during an in-person class through verbal and nonverbal communication (Kebritchi et al., Citation2017). Because online education is a largely self-directed experience that can be challenging for students, learner readiness has been shown to be a predictor of success in online classes (Kebritchi et al., Citation2017). Identifying learner readiness with regard to self-motivation and self-direction could help improve learning experiences in online education, yet not all instructors utilize tools that can help them identify their students' readiness prior to the start of the class (Kebritchi et al., Citation2017).

Academic integrity also remains a major concern in online education because the physical interaction between instructor and student is removed (Daffin, Jr., & Jones, Citation2018). Without this physical interaction, students could use unauthorized resources to complete coursework; collaborate with others on closed-book, individual assignments; plagiarize assignments; or even hire people to do their work for them (Daffin, Jr., & Jones, Citation2018; Gudiño Paredes et al., Citation2021). Online proctoring companies can help facilitate and enforce individual, closed-book exams, but these services can come with a financial burden, technology requirements, privacy concerns, and heightened testing anxiety (Hussein et al., Citation2020). However, the evidence on whether online learning environments lead to greater instances of academic misconduct remains inconclusive (Gudiño Paredes et al., Citation2021).

Efficient and effective teaching methods in online education are a priority. It can be challenging to create new content and adapt preexisting course materials to an online setting, and many online course instructors lack proper training and support (Kebritchi et al., Citation2017). The growth of online education has opened new educational research opportunities, and emerging studies aim to understand the factors that constitute success or failure in an online course (Wei & Chou, Citation2020). Yet, there remains a widespread lack of comprehensive understanding of online pedagogy and online learning styles, in addition to limited availability of administrative resources and support (Palvia et al., Citation2018). Among other concerns, this lack of understanding, resources, and support can lead to poor learning outcomes, faculty resistance to online education, and employer bias against online degrees (Palvia et al., Citation2018). Thus, it is imperative that instructors understand online pedagogy and thoughtfully design online courses using educational theory to ensure students meet the intended learning outcomes.

Distance learning encompasses a variety of formats, which are collectively referred to as online learning (Palvia et al., Citation2018). However, the turn to remote education during the COVID-19 pandemic served as an impetus to define these online learning styles and formats. In fully online education, all instruction, testing, assignments, and discussion take place online, and the course is thoughtfully designed using educational theory. Blended or hybrid teaching utilizes a combination of online learning and F2F activities. Emergency remote teaching is the shifting of courses to a fully online format in the event that faculty or students are unable to come to campus. Remote teaching can be delivered synchronously (i.e., lecturing in real time using a web-based platform such as Zoom), asynchronously (i.e., pre-recorded lectures posted on the learning management system), or through a combination of these two approaches (Dhawan, Citation2020).

The quality of online education has emerged as a priority area of educational research, and the research on its efficacy remains divided (Driscoll et al., Citation2012; Soffer & Nachmias, Citation2018). One study by Paul and Jefferson (Citation2019) found no significant difference in performance (i.e., final course grade) between the F2F and online versions of an environmental science class. Although there is supporting evidence that online education can result in comparable outcomes to the equivalent F2F version of a course, there is still a dearth of literature (Paul & Jefferson, Citation2019; Soffer & Nachmias, Citation2018). Many of these limited studies measure learning performance by final grade or grade point average, neither of which captures the learning demonstrated through formative assessments (Wei & Chou, Citation2020). Thus, there is a need to compare learning outcomes between F2F and online versions of the same course to evaluate whether or not the same learning goals are achieved.

Nutrition 10: Discoveries and Concepts in Nutrition is a 3-unit, high-enrollment introductory nutrition course that enrolls more than 500 students each quarter and is offered year-round at the University of California (UC), Davis. Nutrition 10 fulfills the science and engineering general education requirement and is designed for nonscience majors. To accommodate additional students, Nutrition 10 received UC Innovative Learning Technology Initiative (ILTI) funds in 2016 to support its development into a fully online course (Nutrition 10V). This grant also enabled the course to be available to students from all 10 UC campuses.

The objective of this pilot study was to investigate differences in academic performance between students participating in the F2F and online versions of Nutrition 10. The specific aim was to identify any differences in student learning outcomes between the F2F and online nutrition courses and to use these findings to improve the course so that students can achieve the same learning outcomes, regardless of course delivery method.

Methods

To develop this fully online course, a development team was formed in 2017, which included the Nutrition 10 instructor, an instructional designer from UC Davis Academic Technology Services (ATS), an education specialist from the UC Davis Center for Educational Effectiveness, and two graduate student researchers (one doctoral candidate with a background in nutrition education and one doctoral candidate with a background in education). Relying upon effective pedagogical and instructional design principles, Nutrition 10 (F2F) was developed into Nutrition 10V (virtual), a fully online course. Technology used during initial production and teaching included the Canvas Learning Management System (LMS), the ATS eLearning Studio (location for filming of lecture content), Aggie Video (cloud-based video management system that stores, hosts, and distributes all faculty-generated videos), PlayPosit (interactive web-based video platform that allows educators to embed questions in videos for formative assessment), Piazza (online question-and-answer platform), Zoom (video platform used for facilitating online office hours), ProctorU (remote cloud-based proctoring service that allows for closed-book, online examinations with a live proctor), and social media. Nutrition 10V was initially offered as a pilot to 100 UC Davis students in winter quarter 2018. After the pilot period, Nutrition 10V enrolled more than 700 students in spring quarter 2018. Currently, Nutrition 10V regularly enrolls between 700 and 900 students (including cross-campus UC students) and is offered in the fall, winter, and spring quarters, in addition to Summer Session II. The course structure of Nutrition 10 and Nutrition 10V is outlined in Supplementary Appendix 1.

In winter 2019, a 27-item questionnaire was developed from the six course learning objectives to assess course-specific nutrition knowledge, with the purpose of providing a reliable evaluation tool for measuring change in knowledge (Supplementary Appendix 2). Three nutrition education experts reviewed the questionnaire for content validity. Six nutrition science subject matter experts also evaluated the nutrition knowledge survey questions by matching each question to a learning objective, rating their confidence in the match, and rating the item's relevance to the matched learning objective to establish acceptable congruence. Pretest data from fall quarter 2019 were analyzed for item-to-total score correlation. Cronbach's alpha was used to determine internal consistency for each section of the survey, with a minimum acceptable level of α > 0.7. The questions matched to each learning objective (LO) were found to have adequate internal consistency (α = 0.84 [LO1]; 0.60 [LO2]; 0.75 [LO3]; 0.73 [LO4]; 0.60 [LO5]; 0.70 [LO6]). The overall internal consistency reliability was α = 0.92.
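For readers who want to reproduce this type of reliability check, the sketch below shows how Cronbach's alpha and a corrected item-to-total correlation can be computed from item-level responses. It is an illustrative example only, not the authors' analysis code; the 0/1 item scoring, the simulated data, and the column names (q1 through q27) are assumptions made for the example.

```python
# Illustrative sketch (not the authors' code): Cronbach's alpha and a corrected
# item-to-total correlation for a block of questionnaire items. Assumes responses
# are scored 0/1 per item, one row per student; column names are hypothetical.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Simulated 0/1 responses for a 27-item pretest (200 hypothetical students)
rng = np.random.default_rng(0)
pretest = pd.DataFrame(rng.integers(0, 2, size=(200, 27)),
                       columns=[f"q{i + 1}" for i in range(27)])
print(f"overall alpha = {cronbach_alpha(pretest):.2f}")

# Corrected item-to-total correlation: correlate each item with the total score
# computed from the remaining items
for item in pretest.columns[:3]:
    rest_total = pretest.drop(columns=item).sum(axis=1)
    print(item, round(pretest[item].corr(rest_total), 2))
```

In practice, the same function would be applied to the subset of columns matched to each learning objective to obtain the per-LO alphas reported above.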

Students enrolled in Nutrition 10 (F2F) and Nutrition 10V (virtual) during fall quarter 2019 and winter quarter 2020 were invited to participate in this pilot study. Before the course began (pre-) and after it concluded (post-), students were asked to complete the nutrition knowledge questionnaire on the Canvas LMS. Students received 1 point of extra credit for completing the precourse survey and 2 points of extra credit for completing the postcourse survey. The precourse survey was released during the first week of the quarter as the first assignment in the class, when the workload was lighter. The postcourse survey was released during the last week of the quarter, when students were finishing all remaining assignments and were motivated to do extra work to increase their grade. Multiple announcements and reminders were sent out about the pre- and postcourse surveys to encourage participation, including a reminder 1 week before the due date and a reminder on the morning the survey was due. Additional data collected from students included demographics, course grades, and anonymous course feedback. Responses from the nutrition knowledge survey, demographics, and grades were matched and used as aggregate data for subsequent analyses. The UC Davis Human Subjects Institutional Review Board approved the pilot study.

Statistical Analysis

Analyses were conducted on data from students who completed both the pre- and postcourse assessments. For all outcomes, means and standard deviations (SDs) for each class were calculated, and distributions were examined for normality using histograms, skewness, and kurtosis. Change in nutrition knowledge was calculated by subtracting precourse scores from postcourse scores. Descriptive statistics were expressed as means and SDs for continuous variables and as percentages for categorical variables. Baseline characteristics were compared between the groups, with categorical variables compared using the chi-squared test for homogeneity. Paired t-tests were used for pre- and postcourse comparisons within each class, and Student's t-tests were used for pre- and postcourse comparisons between the two classes. Stata 16 software (StataCorp, College Station, TX, 2019) was used for all statistical analyses, and significance was determined using p ≤ 0.05.
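As a rough illustration of the analysis pipeline described above, the sketch below applies the same family of tests to a hypothetical student-level data set. The published analysis was run in Stata 16; this Python version, the column names (modality, pre_score, post_score, gender), and the data layout are assumptions made for the example.

```python
# Minimal sketch (assumed data layout, not the authors' Stata code) of the
# within-class, between-class, and baseline comparisons described above.
import pandas as pd
from scipy import stats

def analyze(df: pd.DataFrame) -> None:
    df = df.copy()
    df["change"] = df["post_score"] - df["pre_score"]  # change in nutrition knowledge

    # Paired t-test: pre- vs. postcourse scores within each class
    for modality, grp in df.groupby("modality"):
        t_stat, p_val = stats.ttest_rel(grp["post_score"], grp["pre_score"])
        print(f"{modality}: mean change = {grp['change'].mean():.1f} points, p = {p_val:.3f}")

    # Student's t-test: change in knowledge compared between the two classes
    f2f = df.loc[df["modality"] == "F2F", "change"]
    virtual = df.loc[df["modality"] == "virtual", "change"]
    t_stat, p_val = stats.ttest_ind(f2f, virtual)
    print(f"F2F vs. virtual change: p = {p_val:.3f}")

    # Chi-squared test of homogeneity for a categorical baseline characteristic
    contingency = pd.crosstab(df["modality"], df["gender"])
    chi2, p_val, dof, _ = stats.chi2_contingency(contingency)
    print(f"baseline gender distribution: chi2 = {chi2:.1f}, p = {p_val:.3f}")
```

The same pattern extends to the quarter-level sub-analyses by first filtering the data frame to a single quarter before calling the function.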

Results

There were 1,029 students enrolled in Nutrition 10 (F2F) and 1,560 students enrolled in Nutrition 10V (virtual) during the study period. Of these students, 978 F2F students and 1,444 virtual students participated in the pilot study, giving an overall participation rate of 95% for the F2F class and 93% for the virtual class. To account for potential differences between quarters, sub-analyses were also conducted. For fall quarter 2019, of the 481 students enrolled in Nutrition 10 (F2F) and 802 students enrolled in Nutrition 10V (virtual), 467 F2F and 762 virtual students participated in the pilot study. For winter quarter 2020, of the 548 students enrolled in Nutrition 10 (F2F) and 758 students enrolled in Nutrition 10V (virtual), 511 F2F and 682 virtual students participated in the pilot study.

Baseline characteristics of students who participated in the pilot study are presented in Supplementary Appendix 3. There were significant differences in gender, age, race and ethnicity, parental education level, enrollment status, transfer student status, and international student status. The majority of participants identified as female (70% in F2F and 65% in virtual) and Asian/Pacific Islander (42% in F2F and 58% in virtual). The mean age of participants was between 19 and 20 years, and most registrants were first-year students, especially in the F2F class. However, the virtual class had greater representation from third- through sixth-year students. There were also more transfer students and international students enrolled in the virtual class than in the F2F class.

Students who took the class, regardless of delivery method, improved their nutrition knowledge (+6.9 points; 95% CI [15.0–8.2]; p < 0.01; Table 1). Students in the F2F class exhibited a significantly greater improvement in nutrition knowledge (+7.7 points; 95% CI [16.0–8.3]; p < 0.01) than those in the virtual class (+6.3 points; 95% CI [14.1–7.9]; p < 0.01; Table 1). This greater improvement in the F2F class compared with the virtual class was also seen when conducting a sub-analysis by the quarter when the class was offered (Table 1).

Table 1. Comparison of nutrition knowledge pre- and post-measure.

The comparison of mean scores on class assessments is shown in Table 2. The mean grade was 93.1% for students in the F2F class and 92.3% for students in the virtual class (p = 0.08). However, there were significant differences between overall course grades when conducting a sub-analysis by the quarter when the class was offered (Table 2). Students enrolled in the F2F course during fall quarter 2019 received a final course grade that was 3% higher than that of students enrolled in the virtual course (p < 0.05), whereas students enrolled in the virtual course during winter quarter 2020 received a final course grade that was 1.2% higher than that of students enrolled in the F2F course (p < 0.01). When examining the combined quarters, there were significant differences in grades on midterm exam #2, quizzes, the food diary project, and extra credit. For these assessments, the F2F course consistently outperformed the virtual course, with the exception of the quizzes. However, the differences varied by the quarter when the class was offered. For fall quarter, there were significant differences in grades on midterm exam #1, midterm exam #2, the final exam, and the food diary project, with the F2F course outperforming the virtual course. For winter quarter, there were significant differences in grades on midterm exam #1 and quizzes, with the virtual course outperforming the F2F course.

Table 2. Comparison of mean scores on class assessments.

Discussion

The present study found that students enrolled in both the F2F and online courses improved their nutrition knowledge by 26%, regardless of course delivery format. However, students in the F2F class had a greater improvement in nutrition knowledge than students in the online class. These findings align with previous research showing that students in F2F courses tend to exhibit better academic performance than students who take the online offering (Hurlbut, Citation2018; Summers et al., Citation2005). Previous studies suggest that F2F students may perform better due to the increased accountability of attending class in person and having consistent communication with the instructional team and their peers (Bandara & Wijekularathna, Citation2017).

However, there is also relevant research that shows no differences in learning outcomes or that online learners performed better than F2F learners (Burkhardt et al., Citation2008; Neuhauser, Citation2002; Paul & Jefferson, Citation2019; Russell et al., Citation2018; Soffer & Nachmias, Citation2018; Wu, Citation2015). One recent review examining differences between online and F2F education found that almost half of the included studies reported improved learning outcomes with online learning, whereas only 18% of the studies favored F2F learning (Stevens et al., Citation2021). The mixed results seen in the literature could be due to a variety of factors and pedagogical variables (Dell et al., Citation2010). User-friendly design and adequate technological support are critical components of a successful online education experience, yet there have been numerous reports that many online classes are not adequately supported, which could contribute to poorer learning outcomes (Castro & Tumibay, Citation2019; Dumford & Miller, Citation2018). Therefore, focusing efforts on quality course design rather than the characteristics of the delivery medium may lead to more beneficial learning in online courses (Castro & Tumibay, Citation2019; Dell et al., Citation2010). Student characteristics such as access to resources, experience with learning tools, existing study skills, and personal traits have all been found to be critical elements for success in an online learning environment (Hurlbut, Citation2018).

Research has also found that online learners can have different expectations than in-person learners, such as not taking assignment deadlines seriously (Kebritchi et al., Citation2017). Online learners may need additional motivation, organization, and self-discipline to achieve the intended learning outcomes (Dumford & Miller, Citation2018). To help improve self-monitoring skills, students in both Nutrition 10 (F2F) and Nutrition 10V (virtual) were sent multiple reminders through the LMS announcements feature before assignments were due. However, the students enrolled in the F2F course also received the same reminders verbally at the beginning of a lecture, whereas the online students received only written reminders. A meta-analysis by Means and colleagues (Citation2010) found that active learning, self-reflection, self-regulation, and self-monitoring can help create more positive online learning outcomes. Students in both classes completed weekly quizzes, and the online class performed significantly better than the F2F class on the quizzes. This outcome helps support the idea that incorporating structured quizzes in an online class could help bolster students’ self-monitoring skills and lead to improved academic performance. Students in the online class also had the chance to practice answering similar questions through the embedded questions in the pre-recorded videos that were created using the PlayPosit software, whereas the in-person students were exposed to think-pair-share questions and student-generated questions during lectures. However, students in the F2F class generally performed better on summative assignments, which may be because they participated in more class activities, such as review sessions, extra credit activities, and lectures. More information is needed about the attendance of F2F versus online learners to determine the cause of the grade differences.

Both classes completed closed-book, proctored exams during the course, and students were not allowed to use any resources besides a nongraphing calculator (e.g., no scrap paper or study guides). The F2F students took their exams on paper, in person, at the designated class time in the lecture hall with teaching assistant proctors, whereas the online students took their exams online through the Canvas LMS using ProctorU, a virtual proctoring company. Online students were given a 12-hour window to take the 1-hour exam, and they scheduled their exam in advance through ProctorU. When it was time to begin the exam, students logged into their ProctorU portal, completed the identity verification, and were then connected to a live ProctorU proctor. The virtual proctor had the student show their surroundings through the webcam to ensure there were no resources, phones, or people around them. After this check, the proctor let the student into the password-protected exam. The session was automatically recorded, and the virtual proctor could view the student's screen and watch the student through their webcam. Instances of potential academic misconduct (e.g., a student looking off-screen) were automatically flagged and sent to the ProctorU review committee. Incidents were flagged as green (low risk), yellow (medium risk), or red (high risk). The instructor could then review the description of the incidents, view the recorded footage, and assess the situation. Because use of the proctoring service was part of a university pilot, the cost associated with online exams was waived for students. However, students needed access to a working webcam and could not use tablets, phones, or Chromebooks due to software requirements. Course evaluations consistently showed that students felt uncomfortable with the virtual proctoring process, which may have contributed to their weaker performance on proctored exams compared with the in-person students. Future online classes should consider the advantages and limitations of utilizing remote proctoring services.

Some learners prefer an in-person environment when engaging in collaborative group work or establishing social relations with peers (Paechter & Maier, Citation2010). Online learners may also feel isolated and disconnected, which can contribute to negative learning experiences (Kebritchi et al., Citation2017). Learners with strong self-directed learning skills tend to be more successful in online education, as do learners with greater self-efficacy (Kebritchi et al., Citation2017). Thus, there is a need to measure learner readiness so that instructors can identify students who may need additional support (Kebritchi et al., Citation2017).

Online learners report that the structured communication within an online course helps them collect and share knowledge (Paechter & Maier, Citation2010). For both Nutrition 10 (F2F) and Nutrition 10V (virtual), similar weekly overview announcements were sent out each Monday morning to bolster communication in both courses. Previous research has shown that an online environment also provides more opportunities for shy or introverted students to participate, which can help connect students with the class (Paechter & Maier, Citation2010). Many online courses have students submit questions in a particular way, usually through email correspondence or an online question-and-answer platform. This approach can create a barrier to students getting their questions answered because they may not want to put in the extra effort to type out their question and wait for a reply (Paul & Jefferson, Citation2019). In the present study, an online question-and-answer platform, managed by the instructional team, was utilized for both classes to help promote interaction between students and the instructional team, as well as student-student interaction. Future research is needed to compare differences in participation between these two classes to determine whether participation had an impact on academic performance.

Previous research has shown that the background characteristics of students enrolled in F2F versus online classes, including gender, age, academic discipline, and prior education, tend to differ, which could impact academic performance (Dumford & Miller, Citation2018). The present study found differences in a multitude of characteristics between the F2F and online classes, including gender, age, race and ethnicity, parental education level, enrollment status, transfer student status, and international student status. Students self-selected to enroll in either the F2F or online course, which could help explain these differences. These background characteristics may have contributed to the differences observed in academic performance, and further research is needed to determine the degree of impact of various factors.

There were also significant differences in final course grade, with the fall 2019 F2F class outperforming the online class by 3% and the winter 2020 online class outperforming the F2F class by 1.2%. This aligns with the mixed findings in the literature about whether F2F or online courses fare better in terms of final course grade (Soffer & Nachmias, Citation2018). However, the end of the winter 2020 quarter coincided with the onset of the COVID-19 pandemic, as institutions made the transition to emergency remote education. Students enrolled in the F2F class had the option of attending the last week of lecture or watching a recording of the lecture after it took place, and the final exam was moved to an online, closed-book format using a virtual proctoring company. These changes may have led to some students not receiving the material from the final two lectures. Furthermore, the disruption of having the exam take place online with the proctoring company, instead of on paper with in-person proctors, may have been difficult for some students and contributed to the grade decline seen on the final exam. Research has found that students purposefully self-select into a F2F course because they desire the interaction with the instructor and their fellow students and because the F2F learning environment aligns with their preferred learning style (Hurlbut, Citation2018). The primary reasons that students self-select into an online course, on the other hand, are scheduling conflicts, flexibility, and/or alignment with their preferred learning style (Hurlbut, Citation2018). Having the course format and expectations change toward the end of the course may have been jarring for students who had enrolled in a F2F course, especially at the onset of the pandemic. Although everyone was under distress at the onset of the pandemic, students in the online course had the expectation that their final exam would be held online, so there was no adjustment in the course delivery. These data help support the need for consistent structure in a class, as the usual activities were disrupted for F2F students, which may have contributed to the grade drop.

Although F2F and online education share many similarities, there are still significant differences between the two modalities (Paul & Jefferson, Citation2019). F2F instruction is dynamic and allows for real-time feedback. This medium provides students with an opportunity to partake in innovative discussion and questions, so they may feel a stronger connection to the class (Paul & Jefferson, Citation2019). Online education can be limiting because students are unable to receive immediate feedback from the instructor (Paul & Jefferson, Citation2019). One avenue for addressing this in Nutrition 10V was embedding questions within the pre-recorded lecture videos, with students receiving real-time targeted feedback that reinforced the correct answer and explained why the other answer choices were incorrect. This approach helped students engage with the material and offered them an opportunity to see what types of questions arise from the material and to receive feedback about the different answer choices.

Overall, although the difference in change in nutrition knowledge between the two classes was significant, there was no significant difference between the final course grades of the F2F and virtual classes in the combined results. However, when conducting a sub-analysis by the quarter when the class was offered, significant differences in final course grade were observed. Although these grade differences are small, variations in student characteristics between quarters, such as college preparedness and prior experience with taking a virtual class, may have contributed to these results and need to be further explored. Managing academic integrity in F2F and online classes is also an area that needs to be addressed in future education research.

Conclusions

The objective of this pilot study was to investigate differences in nutrition knowledge and course grades between students participating in the F2F and online versions of an introductory, college-level nutrition course. Students improved their nutrition knowledge regardless of course delivery format. However, students in the F2F class had a greater improvement in nutrition knowledge than students in the online class. When conducting a sub-analysis by the quarter when the class was offered, significant differences in final course grade were found, with the fall 2019 F2F class outperforming the online class and the winter 2020 online class outperforming the F2F class. These data help support the need for a reliable knowledge survey based on the course learning objectives to measure change in knowledge that cannot be captured through final course grade alone. Additionally, consistent structure in a class may contribute to student performance; during this study period, the usual activities were disrupted for F2F students at the onset of the pandemic. Although F2F and online education share many similarities, significant differences remain between the two modalities that warrant further research.

Supplemental material


References

  • Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Babson Survey Research Group and Quahog Research Group.
  • Bandara, D., & Wijekularathna, D. K. (2017). Comparison of student performance under two teaching methods—face to face and online. International Journal of Education Research, 12(1), 69–79.
  • Burkhardt, J. M., Kinnie, J., & Cournoyer, C. M. (2008). Information literacy successes compared: Online vs. face to face. Journal of Library Administration, 48(3–4), 379–389. https://doi.org/10.1080/01930820802289425
  • Castro, M. D. B., & Tumibay, G. M. (2019). A literature review: Efficacy of online learning courses for higher education institution using meta-analysis. Education and Information Technologies, 26(2), 1367–1385. https://doi.org/10.1007/s10639-019-10027-z
  • Daffin, L. W., Jr., & Jones, A. A. (2018). Comparing student performance on proctored and non-proctored exams in online psychology courses. Online Learning, 22(1), 131–145. https://doi.org/10.24059/olj.v22i1.1079
  • De Brey, C., Snyder, T. D., Zhang, A., & Dillow, S. A. (2021). Digest of Education Statistics 2019 (NCES 2021-009). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.
  • Dell, C. A., Low, C., & Wilker, J. F. (2010). Comparing student achievement in online and face-to-face class formats. Journal of Online Learning and Teaching, 6(1), 30–42.
  • Dhawan, S. (2020). Online learning: A panacea in the time of COVID-19 crisis. Journal of Educational Technology Systems, 49(1). https://doi.org/10.1177/0047239520934018
  • Driscoll, A., Jicha, K., Hunt, A. N., Tichavsky, L., & Thompson, G. (2012). Can online courses deliver in-class results? Teaching Sociology, 40(4), 312–331. https://doi.org/10.1177/0092055x12446624
  • Dumford, A. D., & Miller, A. L. (2018). Online learning in higher education: Exploring advantages and disadvantages for engagement. Journal of Computing in Higher Education, 30(3), 452–465. https://doi.org/10.1007/s12528-018-9179-z
  • Fischer, C., Xu, D., Rodriguez, F., Denaro, K., & Warschauer, M. (2020). Data on online and face-to-face course enrollments in a public research university during summer terms. Data Brief, 29, 105320. https://doi.org/10.1016/j.dib.2020.105320
  • Goodman, J., Melkers, J., & Pallais, A. (2019). Can online delivery increase access to education? Journal of Labor Economics, 37(1), 1–34. https://doi.org/10.1086/698895
  • Gudiño Paredes, S., Jasso Peña, F. D. J., & de La Fuente Alcazar, J. M. (2021). Remote proctored exams: Integrity assurance in online education? Distance Education, 42(2), 200–218. https://doi.org/10.1080/01587919.2021.1910495
  • Hurlbut, A. R. (2018). Online vs. traditional learning in teacher education: A comparison of student progress. American Journal of Distance Education, 32(4), 248–266. https://doi.org/10.1080/08923647.2018.1509265
  • Hussein, J. M., Yusuf, J., Deb, A. S., Fong, L., & Naidu, S. (2020). An evaluation of online proctoring tools. Open Praxis, 12(4), 509–525. https://doi.org/10.5944/openpraxis.12.4.1113
  • Kebritchi, M., Lipschuetz, A., & Santiague, L. (2017). Issues and challenges for teaching successful online courses in higher education. Journal of Educational Technology Systems, 46(1), 4–29. https://doi.org/10.1177/0047239516661713
  • Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. U.S. Department of Education, Office of Planning, Evaluation, and Policy Development.
  • Neuhauser, C. (2002). Learning style and effectiveness of online and face-to-face instruction. American Journal of Distance Education, 16(2), 99–113. https://doi.org/10.1207/s15389286ajde1602_4
  • Paechter, M., & Maier, B. (2010). Online or face-to-face? Students’ experiences and preferences in e-learning. The Internet and Higher Education, 13(4), 292–297. https://doi.org/10.1016/j.iheduc.2010.09.004
  • Palvia, S., Aeron, P., Gupta, P., Mahapatra, D., Parida, R., Rosner, R., & Sindhi, S. (2018). Online education: Worldwide status, challenges, trends, and implications. Journal of Global Information Technology Management, 21(4), 233–241. https://doi.org/10.1080/1097198x.2018.1542262
  • Paul, J., & Jefferson, F. (2019). A comparative analysis of student performance in an online vs. face-to-face environmental science course from 2009 to 2016. Frontiers in Computer Science, 1, 7. https://doi.org/10.3389/fcomp.2019.00007
  • Russell, J.-e., Van Horne, S., Ward, A. S., Bettis, E. A., Sipola, M., Colombo, M., & Rocheford, M. K. (2018). Large lecture transformation: Adopting evidence-based practices to increase student engagement and performance in an introductory science course. Journal of Geoscience Education, 64(1), 37–51. https://doi.org/10.5408/15-084.1
  • Soffer, T., & Nachmias, R. (2018). Effectiveness of learning in online academic courses compared with face-to-face courses in higher education. Journal of Computer Assisted Learning, 34(5), 534–543. https://doi.org/10.1111/jcal.12258
  • Stevens, G. J., Bienz, T., Wali, N., Condie, J., & Schismenos, S. (2021). Online university education is the new normal: But is face-to-face better? Interactive Technology and Smart Education, 18(3), 278–297. https://doi.org/10.1108/itse-08-2020-0181
  • Summers, J. J., Waigandt, A., & Whittaker, T. A. (2005). A comparison of student achievement and satisfaction in an online versus a traditional face-to-face statistics class. Innovative Higher Education, 29(3), 233–250. https://doi.org/10.1007/s10755-005-1938-x
  • Wei, H.-C., & Chou, C. (2020). Online learning performance and satisfaction: Do perceptions and readiness matter? Distance Education, 41(1), 48–69. https://doi.org/10.1080/01587919.2020.1724768
  • Wu, D. (2015). Online learning in postsecondary education: A review of the empirical literature (2013–2014). Ithaka S + R. https://doi.org/10.18665/sr.221027