Research Article

Improving retention while enhancing student engagement and learning outcomes using gamified mobile technology

Received 02 Apr 2023, Accepted 22 Feb 2024, Published online: 05 Mar 2024

ABSTRACT

There is growing support for the argument that effective use of technology in blended learning initiatives, such as gamification, can influence student retention, combat low levels of student engagement, and improve academic performance. The aim of this study is to investigate the effectiveness of a gamified mobile application (GMA) in enhancing student engagement, improving student learning outcomes, and increasing retention. Our study indicates significant differences in student engagement and retention between users and non-users of a GMA for first-year accounting students. Specifically, the study found users of the GMA were more engaged with learning materials, their teachers, and their peers. The improved learning outcomes also motivated a greater number of students to persist and continue their studies. Furthermore, GMA users reported that its use increased their connectedness and propensity for engagement.

Introduction

Can the utilisation of a gamified mobile application (GMA) enhance student engagement, improve student achievement of learning outcomes, and increase retention? This research delves into this inquiry due to the critical importance of enhancing student engagement. An improved level of student engagement holds the potential to positively influence the overall learning experience. When students are actively engaged, there is a greater likelihood of their increased participation in studies, ultimately contributing to enhanced retention of knowledge and heightened motivation to learn (Arufe Giráldez et al., 2020; Ferriz-Valero et al., 2020; Parody et al., 2022).

The attainment of higher student achievement stands as a foundational objective in education; given the growing prevalence of technology in the classroom, this research is timely and important. If GMAs prove to be conducive to superior learning outcomes, their utilisation could potentially cultivate more well-rounded and knowledgeable students. This study holds significance for accounting educators globally as it examines whether incorporating a GMA can boost student engagement, enhance the attainment of learning objectives, and contribute to higher retention rates.

Addressing the convergence of accounting education and technology, this research examines the impact of introducing innovative teaching methods with the potential to elevate the quality of accounting graduates. Our study and methodological approach address the research question of whether the integration of a GMA can enhance student engagement, improve the achievement of learning outcomes, and boost retention. This is accomplished by combining quantitative data collection, specifically grade results, with qualitative data in the form of student perceptions. This combined approach aims to assess the influence of the GMA on these constructs.

The findings of this study lend support to the hypotheses proposing that incorporating a GMA yields positive outcomes in terms of student achievement in learning outcomes and heightened student engagement. The statistical outcomes indicate that the GMA played a pivotal role in enhancing student attainment of learning objectives and fostering increased motivation for engagement with peers and learning materials.

Contribution

This research contributes significantly to the accounting education literature by addressing the intersection of accounting education and technology, making it highly relevant and noteworthy for accounting educators globally. The research adds to the ongoing discourse on the optimal preparation of accounting students for successful careers in a rapidly evolving profession, irrespective of the geographical location of their accounting studies. The exploration of the efficacy of GMAs in education mirrors the increasing role of technology in the contemporary classroom. To ensure continued engagement, and subsequently improved learning outcomes, it is important for educators to adapt to the 'new' tech-savvy student.

The remainder of our paper is structured as follows. First, a literature review examines student engagement, active learning, and learning technologies and gamification. This is followed by a discussion of our methodology, which includes assessment before and after the GMA intervention, and how the choice of the GMA was decided upon and implemented. Results are then presented, and the paper culminates in a discussion of results and the conclusion.

Literature review

Student engagement

Student engagement is defined as the 'time and effort students devote to educationally purposeful activities' (Radloff & Coates, 2010, p. 1). Kahu (2013) argues that engagement in higher education is an essential part of the learning process; educators must consider the engagement of students when determining the effectiveness of teaching and learning. Student engagement is widely acknowledged to be an important precursor to effective learning and academic success at all levels of education (Korobova & Starobin, 2015; Schmitz & Hanke, 2023; TEQSA, 2020; Thomas & Heath, 2014). Increased student engagement is also linked to retention (Meer & Chapman, 2014; Pechenkina et al., 2017).

Student engagement is not captured by a single concept but rather by multiple, and sometimes complex, activities. Hence, establishing a single model is problematic, as demonstrated by the body of work in this area (see Kahu, 2013). Recent research has emerged that attempts to bring together the complexity of student engagement into one model (e.g. Kahu, 2013; van der Meer et al., 2018). Kahu's (2013) framework of student engagement considers the impact of structural and psychosocial influences. Consequences of student engagement are considered as proximal (academic achievement and social satisfaction) and distal (retention, work success, lifelong learning, citizenship, and personal growth). Inspired by the work of Kahu (2013), a model of student engagement developed by van der Meer et al. (2018) expands student engagement as comprising campus and community engagement. At the heart of these models is academic engagement, which captures cognition (deep learning), and psychosocial aspects of emotional/affective engagement (enthusiasm, interest and belonging), behavioural engagement (time and effort, interaction, participation), and relational engagement (connectedness). It is on these concepts that this study leans with respect to student academic engagement.

Active learning

In recent years, there has been an increasing emphasis on the importance of active learning methodologies in higher education – a constructivist approach (Knowles, 1973). This shift has been driven by a growing body of research that suggests that traditional methods of teaching, such as lecture-based instruction, may not be as effective as more interactive and collaborative approaches (Coates, 2010; Crosling et al., 2009; Merriam, 2017; Schmitz & Hanke, 2023). Research has shown that active learning methodologies can have a range of positive impacts on student learning outcomes, including improved retention of information, increased motivation and engagement, and higher levels of achievement (Arjomandi et al., 2018). According to Shernoff et al. (2014a, p. 167), 'if engagement with learning arises from the reciprocal interaction between learners and a learning environment', then a teacher's ability to engage students rests with their ability to 'create, shape, and influence the whole learning environment' (Hamari et al., 2016; Krath et al., 2021; Shernoff et al., 2014a, 2014b).

Carvalho et al. (2021) suggest that humans learn best when they are active (not passive) and engaged (not distracted), and when learning occurs in a socially interactive context that is iterative (not merely repetitive) and fun. They put forward that, in the new educational context, it is important that pedagogical activity is oriented towards the adoption of techniques and activities that encourage student participation. They believe that innovative, active learning methodologies encourage students to seek knowledge autonomously, developing their creativity and critical analysis through problem-solving.

Learning technologies and gamification

Various studies have found a positive correlation between the use of educational technology and measures of engagement (Chen et al., 2010; Laird & Kuh, 2005; Lo, 2023; Sun et al., 2018). Some suggest that the effective use of educational technology in blended learning initiatives may influence student retention (Olelewe et al., 2019) and combat low levels of student engagement (Arbaugh, 2000; Bharucha, 2017; Luthans et al., 2016). Specifically, students may respond well to gamification designs (Denny, 2013; Krath et al., 2021). Zainuddin et al. (2020), in their review of the literature on gamification, put forward that the incorporation of gamification is thought to be a successful method for boosting students' motivation and enhancing their educational experience, involvement, and achievements. They found that numerous research studies have suggested that introducing elements of gaming, like badges, levels, and leaderboards, has a beneficial impact on learner engagement. Using gamification and multiple opportunities for students to join in and engage with activities may allow students to develop a rhythm and pattern of behavioural engagement, which facilitates sustained engagement. Detractors, however, contend that this primarily generates external rather than internal motivation; in other words, learners may complete a task solely to obtain a badge, rather than deriving satisfaction from acquiring new knowledge and skills (Zainuddin et al., 2020). Calls have been made for a greater understanding of the role that educational technology plays in affecting student engagement (Bond et al., 2020; Krause & Coates, 2008; Nelson Laird & Kuh, 2005; Zhoc et al., 2019).

Our study goes some way towards answering that call through an investigation of student engagement, achievement of learning outcomes and retention in a first-year introductory accounting subject. The subject determines student achievement of learning outcomes by assessing student performance in three major assessment pieces. The first of these is centred on student engagement and is the assessment piece to which a GMA is introduced in this study. The Quitch GMA deployed in this study (discussed further below) took both the instructor and student perspective in its original design, which is suggested to be best practice by Chickering and Gamson (1999).

To test whether there is indeed a positive relationship between the use of a GMA and student engagement, achievement of learning outcomes, and retention, we develop the following set of hypotheses:

Method

To answer whether student use of a gamified mobile application improves their engagement and/or assessment performance and/or retention, a study was conducted at an Australian multi-campus university. Subjects delivered by its business school are taught across five campuses in three major cities.

Students enrolled in the introductory accounting subject attend weekly, two-hour practical workshops in lieu of lectures and tutorials (Cheng et al., 2019). The workshops are supported with online learning material posted in the Learning Management System (LMS), including videos, podcasts, activities, and readings. Students are required to complete these online activities prior to attending workshops so that they are prepared to participate in class activities. The subject is coordinated by a national academic manager who designs the LMS and populates the site with learning materials and activities. Each campus has its own lecturer who instructs students. All the subject teaching material is provided to the lecturers by the national academic manager. In essence, all students use the same LMS, have the same learning activities and are taught with the same learning material.

Why was Quitch chosen as the GMA?

In previous years, before the implementation of the current Assessment 1 (Student engagement), the course utilised a non-assessable textbook activity site for students to practise more intricate accounting concepts. The decision to explore an alternative approach stemmed from the academic manager's aspiration to enhance student engagement and achievement of learning outcomes, and to improve retention. Additionally, an alteration in the pricing model by the textbook publisher played a role in this transition. The new pricing structure involved a fixed base fee of AUD 5,000 along with a per-student cost. Notably, during the study period, the entire expense of the textbook activity site was covered by the School, with no intention of shifting this cost burden to students either currently or in the future.

The academic manager was aware that, historically, the textbook activity site was predominantly used by high achievers rather than the whole cohort, which contributed to poorer-performing students not completing assessment (attrition); this prompted the manager to initiate a discussion with the Head of School and the finance manager about changing to a digital interactive learning tool. The academic manager believed a more interactive, mobile tool would increase utilisation by students and hence encourage greater engagement. Senior management requested the academic manager review potential interactive programmes that were more cost effective and that would not take up considerable workload in implementation. The academic manager reviewed several available interactive systems against senior management's needs, but also sought those that would enable easy access by students, provide significant data analytics of student performance and engagement for lecturers, and that had received good ratings from student users.

Quitch™ is a gamified mobile application that assists educators in harnessing the inherent sociability and technological proficiency of higher education students to significantly enhance their learning experience and outcomes. Quitch bridges the gap between in-class learning and self-learning by delivering valuable knowledge, quizzes and revision reminders directly to students' mobile phones. Quitch also delivers an educator portal that provides comprehensive analytics to monitor student performance and engagement (Quitch, 2021). On reviewing Quitch as a potential GMA, the review found that educators receive real-time data on learner performance, allowing for early intervention, and that it is intuitive and simple to set up. Further analysis of Quitch found that students should experience a fun, engaging and motivating way to learn, along with personalised analytics and the ability to learn anytime, anywhere, meeting them wherever they are. Entrenched in Quitch's gameplay is spaced repetition, which combats a human's natural tendency to forget. To engage students in their learning, gamified elements are added to the mobile application, such as points for getting questions correct, animated badges for achieving goals and a leaderboard showing individual performance against the class. The Quitch GMA enables the breakdown of new learning material into bite-sized chunks that can encourage students to revise regularly through short bursts of rapid-fire gaming. The GMA has the ability to gradually increase the interval between revision sessions, which commits information to a student's long-term memory through the practice of active recall. The academic manager found that the focus of the Quitch GMA is on providing an engaging experience, real-time feedback, interactivity and personalised learning.
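The expanding-interval idea behind spaced repetition can be sketched in a few lines. This is a minimal illustration only: the interval length, doubling factor and function name are assumptions for demonstration, not Quitch's proprietary scheduling algorithm.

```python
from datetime import date, timedelta

def spaced_repetition_schedule(start: date, sessions: int,
                               base_days: int = 1, factor: int = 2) -> list[date]:
    """Return revision dates with a gap that doubles after each session.

    Hypothetical parameters: a 1-day initial gap and a doubling factor
    stand in for whatever schedule the real application uses.
    """
    schedule, gap, current = [], base_days, start
    for _ in range(sessions):
        current += timedelta(days=gap)   # next revision after the current gap
        schedule.append(current)
        gap *= factor                    # widen the gap for the next session
    return schedule

# A topic first studied on 1 March would be revised 1, 2, 4 and 8 days later.
dates = spaced_repetition_schedule(date(2023, 3, 1), sessions=4)
```

Each session pushes the next reminder further out, which is the mechanism the description above credits with committing material to long-term memory through active recall.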

To commence gaming, the student is invited to a private Quitch site that has been set up by an educator using the student's institutional email. The invitation instructs them on how to download the application, register and get started. Students are not identified to their peers but are identified to the educator. Both the educator portal and the student portal visually highlight topics that students need to focus on: red indicates poor performance, yellow and orange identify areas for improvement, and green indicates that the content has been mastered. Student reviews reported that this instant feedback empowers them, as they know what areas to focus on. In setting up the Quitch GMA, the educator is required to schedule content to be pushed to students' mobile devices (daily, weekly, monthly) to keep them motivated and engaged. The educator has the ability, in real time, to identify the content areas of difficulty for the cohort and can therefore respond quickly to ensure student needs are met: a student-centric approach at scale. In addition, the educator can identify, again in real time, the students who are 'at-risk' or 'not participating' and reach out to these students through the mobile application on the students' mobile devices to assist them with problem areas. From an educator's perspective, the ability to provide individual feedback on performance at scale is very useful. The Quitch GMA also has standardised question templates for various disciplines (for example, already-created templates for physics, accounting, anatomy, etc.) that can be adopted as a time-saving endeavour.
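The traffic-light indicator described above amounts to mapping a performance score onto a small set of colour bands. A sketch follows; the 50/70/85 threshold values are illustrative assumptions, as the article does not state Quitch's actual cut-offs:

```python
def mastery_colour(score_pct: float) -> str:
    """Map a topic score (0-100) to the red/orange/yellow/green indicator.

    Only the colour semantics (red = poor, orange/yellow = improving,
    green = mastered) come from the description above; the numeric
    cut-offs are hypothetical.
    """
    if score_pct < 50:
        return "red"       # poor performance: needs focus
    if score_pct < 70:
        return "orange"    # area for improvement
    if score_pct < 85:
        return "yellow"    # approaching mastery
    return "green"         # content mastered
```

Because both portals share the same mapping, student and educator see the same at-a-glance picture of which topics need attention.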

Given senior management's and the academic manager's requirements, the most cost-effective and intuitive GMA was Quitch™. At the time of implementation, the Quitch application fee was based on a per-user fee of $7.50 per student plus an annual licence fee of $300. Including goods and services tax (GST), this was far less than the previous textbook activity website.

Assessment: how it was (before introducing the GMA)

The subject in which the intervention took place is a first-year introductory accounting unit delivered over a 12-week semester. It had three summative assessments, of which Assessment 1 had three sub-tasks that added up to a total of 20%; these were used to capture varying types of student engagement.

  1. Assessment 1 (AT1) – Student engagement:

    1. AT1.1: In-Class engagement (7%).

      • Completing In-Class activities and collaboration with educator and peers

    2. AT1.2: Discussion Forum questions (6%).

      • Responding to discussion forum questions on a weekly basis (no change was made to this assessment component for either cohort in the study)

    3. AT1.3: Answering online Quizzes (7%).

      • Logging in to the LMS and answering set quizzes each week.

  2. Assessment 2 (AT2) – Financial statement analysis (FSA) and business report. Students were required to analyse financial statements using various ratios and write a business report of their findings. The report was to be submitted through TurnItIn (30%).

  3. Assessment 3 (AT3) – Final exam covering all topics except FSA. A traditional exam with conceptual and problem-based questions that was invigilated online with Proctorio (50%).

Student engagement was assessed from weeks 2 through 11, a total of 10 weeks. Student responses to quizzes and the discussion forum were not graded on correctness but rather on whether the sub-criteria were attempted. Week 1 was not assessed but was used as a practice week. Students who used this method of assessment became the Control Group (discussed further below).
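The weighting scheme above reduces to a simple weighted sum. As a hypothetical sketch (the helper function and component keys are ours; the weights come from the assessment structure described above):

```python
# Weights from the assessment structure: AT1 sub-tasks total 20%,
# the business report (AT2) 30% and the final exam (AT3) 50%.
WEIGHTS = {"AT1.1": 0.07, "AT1.2": 0.06, "AT1.3": 0.07, "AT2": 0.30, "AT3": 0.50}

def overall_mark(raw_pct: dict[str, float]) -> float:
    """Combine raw component percentages into a final mark out of 100."""
    return round(sum(raw_pct[item] * weight for item, weight in WEIGHTS.items()), 1)

# e.g. full engagement marks, 80% on the report and 60% on the exam:
mark = overall_mark({"AT1.1": 100, "AT1.2": 100, "AT1.3": 100, "AT2": 80, "AT3": 60})
# → 74.0
```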

The intervention: setting up the GMA

To test whether the implementation of a GMA improves student engagement, achievement of learning outcomes across various assessments, and retention, the GMA site needed to be developed. The goal of the academic manager was for students to feel motivated and to have easy access. The plan was to add questions that were not difficult but were meaningful, goal-directed and skill-building, and that encouraged students to research what the answer might be.

The learning objectives of the subject of the study did not quite fit the standardised accounting site offered by Quitch, so the academic manager chose to create a hybrid site (a combination of the standardised site and new areas specifically related to the subject's learning outcomes). In setting up Quitch (the GMA), the national academic manager first developed a master Quitch site that had 10 sections (one for each assessable week). Each student was automatically given a pseudonym so that their real name was not revealed to other students. Ten questions were then added to each of the weekly topics (10 weeks of 10 quiz questions); these question sets were named 'Quitch on-the-go'. The questions consisted of multiple choice, fill in the blank, and true or false. Questions allowed for hints (text page references) and answer explanations. It was hoped that the text page references would encourage students to read the textbook to understand concepts.

Using the option of periodic release in the GMA, questions were set to release periodically throughout the week for each topic. Students received notifications through the mobile app when a new question was released. They could go on their phones and answer instantly, or they could choose to wait until they had looked up relevant page numbers in the textbook. If students chose an incorrect answer, they received instant feedback with a hint to help them get it correct; if they chose a correct answer, they received a motivational congratulatory message/badge.
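Periodic release of this kind can be sketched as spacing a week's questions evenly across the Monday-to-Monday window. The even spacing and the function name are illustrative assumptions, not Quitch's documented behaviour:

```python
from datetime import datetime, timedelta

def weekly_release_times(week_start: datetime, n_questions: int = 10) -> list[datetime]:
    """Spread n_questions release times evenly across a seven-day window.

    Hypothetical scheduler: each returned timestamp would trigger a
    push notification to students' phones.
    """
    step = timedelta(days=7) / n_questions   # ~16.8 hours between releases
    return [week_start + step * i for i in range(n_questions)]

# Releases for a topic week beginning Monday 9 am.
times = weekly_release_times(datetime(2023, 3, 6, 9, 0))
```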

The second set of quizzes created were those to be used in class, titled 'Quitch In-Class'. Three to five questions were added for each week. These were released by lecturers during classes as specific concepts were discussed. They were not scheduled to open across all campuses at the same time, given various campuses differ in their progression through learning material during a workshop. Only students who attended workshops received marks for attempting In-Class questions. Answering these questions correctly also gave students who attended workshops bonus scores on the inherent competition leaderboard; this, in turn, increased students' motivation to attend workshops.

After the academic manager had created the master Quitch site it was duplicated for each of the five campuses. Enrolled students for individual campuses were loaded to each site and they were sent an invitation to their institutional email accounts with a link to download the application to their phones and to join their campus Quitch sites.

Overall, it took the academic manager about a week to set up a master site with weekly sections, questions and release timing, and to copy the master site to all individual campus sites that would be monitored by campus lecturers. This time would be well worth the investment if the Quitch GMA were successful in improving student engagement, student achievement of learning outcomes and retention.

Assessment: how it emerged

As with the assessment structure prior to the intervention, responses to quizzes or discussion forums were not graded based on correctness but rather on whether students attempted each task. The control group is the non-GMA users (Assessment: How it was) and the test group is the GMA users (Assessment: How it emerged). The test group cohort were taught in the year following the control group cohort. All assessment items were similar for the control and test groups apart from two Assessment 1 sub-criteria that were subject to the intervention. There were no changes to assessment weightings for either the control group or the GMA user group. Table 1 summarises the revised assessment information and highlights the differences between the non-users and the users of the GMA.

Table 1. Assessment summary.

The highlighted differences presented in Table 1 indicate the change between AT1.1 (In-Class engagement) and AT1.3 (Online quizzes) for the control and test groups; these are discussed further below:

AT1.1

Control group (non-GMA users): In-Class engagement (AT1.1) was measured by student participation in the classroom (face-to-face or online) and included the lecturer observing, for example, how many times students asked questions, how many activities they completed in class, and whether they were actively collaborating with their peers. This method for evaluating student In-Class engagement was subjective, and marking was not moderated by other lecturers. Lecturers would allocate each student points, firstly for being in attendance (1 mark) and secondly, for participation on a Likert-type scale ranging from 0 (no participation) to 5 (often participated). This was assessed for 10 weeks from week 2 and weighted at 7% at the end of the semester.

Test group (GMA users): In the second semester, the test group was introduced to a gamified mobile application (Quitch™) for use In-Class in an attempt to increase student engagement in workshops. Students again were given a mark for being in attendance; however, rather than subjectively measuring their engagement in class, students in attendance were provided with 'Quitch In-Class' quiz questions (discussed earlier), which were released randomly and periodically throughout workshops. The academic manager would view the results of the quizzes after classes to note which students answered them. The academic manager then allocated a point to students who answered, regardless of whether the answers were correct.

Note: AT1.1 for the control group was assessed qualitatively, while for the test group it was assessed quantitatively; that is, the subjectiveness was removed, thereby removing any potential bias.

AT1.3

Control group (non-GMA users): The third criterion was the online Quizzes. Online Quizzes were created in the subject's LMS, and students were required to attempt all questions for each week's Quiz. The quiz was open from Monday to Monday, and students could answer anytime during that period.

Test group (GMA users): The test group was introduced to a gamified mobile application (Quitch™) for the online weekly Quizzes, 'Quitch on-the-go' (discussed earlier). Each week, ten questions were pushed out periodically to students' mobile phones, and students could answer them wherever they were and whenever they wanted, without needing to log in to the subject's LMS. Once released, the questions were open from Monday to Monday and students could answer anytime during that period. Again, the academic manager allocated points to students who answered, regardless of whether the answers were correct.

Data collection

At the end of the test group's semester, and after all assessment had been completed, individual assessment results for all students of both the control and test groups were collated from the institution's secure site.

Student perceptions: GMA user

To further our knowledge of students' perceptions of using the GMA, a survey was conducted of the test group (GMA users). Student engagement does not only focus on engaging with learning materials; the Kahu (2013) and van der Meer et al. (2018) models emphasise the many facets that make up student engagement. Psychosocial aspects encompass enthusiasm, interest and belonging, time and effort, interaction, participation, and connectedness. Questions included, for example: 'Using Quitch gave me a strong sense of belonging in the unit'; 'Using Quitch encouraged me to want to know more about weekly topics' and 'Using Quitch made me feel connected to my peers'. Following ethics approval by the institution, the survey was created, providing students with relevant information about the study and advising that their responses were confidential. The survey was conducted after all pieces of assessment had been submitted for the semester.

Student users were asked to provide responses to a series of survey questions that captured the various concepts related to the use of the GMA. These were asked on a five-point Likert scale where 5 = To a great extent, 4 = More often than not, 3 = To an average extent, 2 = To some extent, and 1 = Not at all.

After the closing date of the perception survey, the survey responses and the individual assessment results for 201 cases (all students of both cohorts) were imported into SPSS version 29. Before conducting statistical tests, the data were screened for irregularity. Using the Missing Values Analysis procedure (listwise method to estimate statistics), we identified that AT2 and AT3 had more than 5% of cases with missing values. We conducted further testing to ensure our data analysis would be robust. Using Separate Variance t-tests, we found systematic relationships between the missing values on AT2 and two other variables: Total (Overall grade) and In-Class (AT1.1), p ≤ 0.05. As a result, 35 cases were removed. We then classified the data by term (teaching period) and gender and assessed the cases for outliers. After identifying the outliers, we assessed them for any potential errors. This assessment found four outliers that were classified as 'extreme'; these were removed from the data set, resulting in 162 cases as our final sample size. The results of our analyses are presented next.
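The screening sequence can be illustrated in pandas (rather than SPSS) on synthetic data: flag variables with more than 5% missing values, drop the affected cases listwise, then remove extreme outliers. All data below are fabricated for the sketch, and the |z| > 3 outlier rule is our assumption, as the paper does not state its exact criterion:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the grade data (not the study's results).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "AT1": rng.normal(15, 3, 60),
    "AT2": rng.normal(24, 7, 60),
    "AT3": rng.normal(38, 8, 60),
})
df.loc[df.sample(6, random_state=1).index, "AT2"] = np.nan  # 10% missing on AT2

# Step 1: flag variables with more than 5% of cases missing.
high_missing = df.columns[df.isna().mean() > 0.05].tolist()

# Step 2: listwise deletion of cases missing on the flagged variables.
clean = df.dropna(subset=high_missing)

# Step 3: drop 'extreme' outliers, defined here as |z| > 3 on any variable.
z = (clean - clean.mean()) / clean.std()
final = clean[(z.abs() <= 3).all(axis=1)]
```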

Results

For Assessment 1, in the non-GMA semester (Semester 1), the In-Class engagement criterion (AT1.1) was evaluated by students' propensity to complete In-Class activities and collaborate with the educator and peers (subjective evaluation). In the GMA semester (Semester 2), students were asked to engage with the quiz questions released periodically throughout the class (quantifiable evaluation) using Quitch. Although increased collaboration with peers was observed in Semester 2, this was not evaluated. In the non-GMA semester, the third engagement criterion, Quizzes (AT1.3), required students to log into the subject's LMS and answer 10 weekly quizzes. In the GMA semester, quiz questions were released and pushed to students' mobile phones throughout the week for them to answer via the Quitch App. The second criterion, answering Discussion Forum questions (AT1.2), was the same for both semesters, as were Assessment 2 and Assessment 3. Descriptive statistics for each of the variables are provided in Table 2.

Table 2. Descriptive statistics for GMA and non-GMA including female and male users for all assessment variables including sub-components.

The study examines the link between engagement and student outcomes in terms of academic results across both cohorts. The mean differences are presented in Table 3.

Table 3. Differences in means between GMA users and Non-GMA users.

Hypothesis 1: Student engagement (AT1)

An independent-samples t-test was performed to compare the means of the two cohorts (GMA users and Non-GMA users) on overall student engagement grade performance. This analysis addresses H1a: The use of GMA increases student achievement of Assessment 1 learning outcomes. The results indicate that GMA users (M = 15.0, SD = 3.5) had a higher mean than Non-GMA users (M = 11.6, SD = 5.9). The mean difference for overall student engagement between GMA users and Non-GMA users is significant, t(160) = 4.48, p < 0.001.
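This comparison can be approximately reproduced from the summary statistics alone using SciPy. The group sizes are not reported in the text; the 82/80 split (consistent with df = 160) is an assumption:

```python
from scipy.stats import ttest_ind_from_stats

# Means and SDs as reported; near-equal group sizes assumed (n1 + n2 - 2 = 160).
t, p = ttest_ind_from_stats(mean1=15.0, std1=3.5, nobs1=82,
                            mean2=11.6, std2=5.9, nobs2=80)
# t comes out near the reported 4.48, with p well below 0.001.
```

With these assumed sizes, the pooled-variance statistic lands close to the reported value, which suggests the cohorts were of roughly equal size.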

To determine differences for each of the sub-components of student engagement, further independent-samples t-tests were performed comparing the means of the two cohorts for each item. This analysis addresses H1b: The use of GMA increases student achievement of Assessment 1 sub-criteria: (a) In-Class engagement; (b) Discussion Forum engagement; (c) Quiz engagement. The descriptive statistics (Table 2) indicate differences in the means of all sub-components of student engagement between non-GMA users and GMA users, with GMA users consistently reporting higher means. As shown in Table 3, GMA users were more engaged In-Class, t(160) = 4.68, p < 0.001, and with Quizzes, t(160) = 5.09, p < 0.001, the two assessment items directly subjected to the teaching intervention. To explore the weekly differences in the sub-criteria of student engagement (AT1.1; AT1.2; AT1.3), each was examined separately to determine differences between cohorts. The analysis of the sub-components and their means is reported below. The independent-samples t-test results at a week level are available on request.

For both cohorts, each weekly In-Class engagement (AT1.1) item was evaluated out of two: for Non-GMA users, one mark for attendance and one for engaging with peers and lecturer; for GMA users, one mark for attempting In-Class Quitch questions (Quitch In-Class) and one mark for attendance. Our analysis of the mean differences for In-Class engagement found significant differences for all weeks except weeks 3 and 6, with GMA users having the higher mean.

Weekly Discussion Forum posts (AT1.2) were evaluated out of two: one mark for a topic attempt and one for business communication. This sub-component of student engagement was administered in the same way for both cohorts. Our findings indicate that GMA users had significantly higher engagement in Discussion Forum posts only in Weeks 2 and 3.

A paired-samples t-test was performed to determine the effect of the GMA on the means of two different methods of engagement, i.e. the weekly In-Class engagement and the weekly Discussion Forum posts. The paired differences for GMA users were significantly positive (higher means for In-Class engagement) for all pairs, whereas the non-GMA users had mixed results. The paired-samples t-test results are available on request.
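A paired-samples t-test of this kind operates on per-student differences between the two engagement modes. A minimal sketch, with invented weekly marks (the real marking scheme is described above):

```python
import math
from statistics import mean, stdev

def paired_t(x, y):
    """Paired-samples t statistic on per-student differences (x - y)."""
    d = [a - b for a, b in zip(x, y)]
    # t = mean difference divided by its standard error; df = n - 1.
    return mean(d) / (stdev(d) / math.sqrt(len(d))), len(d) - 1

# Invented marks for the same six students on the two engagement modes.
in_class = [2, 2, 1, 2, 2, 1]  # weekly In-Class engagement (out of 2)
forum    = [1, 2, 0, 1, 2, 0]  # weekly Discussion Forum posts (out of 2)
t, df = paired_t(in_class, forum)
```

A positive t here corresponds to the reported pattern of higher In-Class means among GMA users.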

The third criterion of student engagement was the weekly Quizzes (AT1.3), each consisting of 10 questions. Student engagement was based on the number of quiz questions attempted rather than whether they were answered correctly; this component was worth 5 marks each week if all questions were attempted. Our analysis found significant differences between the cohorts, with GMA users consistently having higher means in all weeks except week 7.

Hypothesis 2: Student outcomes (AT2)

Assessment 2 related to financial statement analysis and report writing and encompassed one week of the curriculum; unlike other topics, it was not revisited in a systematic manner. To determine differences in achievement of Assessment 2 learning outcomes between GMA users and non-GMA users, an independent-samples t-test was performed to determine the effect of the GMA on students’ achievement of AT2. This analysis addresses H2: The use of GMA increases students’ achievement of Assessment 2 learning outcomes. The results in Table 3 indicate that there was no significant difference between the means of Non-GMA users (M = 24.6, SD = 7.4) and GMA users (M = 23.1, SD = 3.5), t(160) = −1.57, p = 0.120.

Hypothesis 3: Student outcomes (AT3)

To determine differences in achievement of Assessment 3 learning outcomes between GMA users and non-GMA users, an independent-samples t-test was performed to determine the effect of the GMA on students’ achievement of AT3. This analysis addresses H3: The use of GMA increases students’ learning outcomes of Assessment 3. The results in Table 3 indicate that there was no significant difference between the means of GMA users (M = 28.5, SD = 7.6) and Non-GMA users (M = 27.7, SD = 6.1), t(160) = 0.77, p = 0.442.

Hypothesis 4: Student outcomes (Overall)

To determine differences in overall grade performance between GMA users and non-GMA users, an independent-samples t-test was performed to determine the effect of the GMA on students’ overall performance. This analysis addresses H4: The use of GMA increases students’ overall achievement of all learning outcomes. The results in Table 3 indicate that there was no significant difference between the means of GMA users (M = 66.7, SD = 10.1) and Non-GMA users (M = 63.9, SD = 12.4), t(160) = 1.59, p = 0.114.

Hypothesis 5: The use of GMA increases retention of students in subject

To determine whether the use of a GMA increases retention of students in the subject, we conducted correlation analysis. Our proxy variable for retention in the subject is completion of the final assessment (AT3); we present the results for all later assessments in Table 4.

Table 4. Correlation between AT1 (Student Engagement) and AT2 (Financial statement analysis (FSA) and business report) and AT3 (Final Exam) for GMA users (n = 80).

As evidenced in Table 4, there are positive correlations between student engagement (AT1) and both later assessments, AT2 and AT3, but these relationships are only statistically significant for GMA users (AT2, r = 0.201; AT3, r = 0.254).
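The Pearson correlation behind such results is a direct computation over paired scores. A sketch with invented marks (the study's r values come from its own data):

```python
import math
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(var_x * var_y)

at1 = [15, 12, 18, 10, 16]  # invented engagement (AT1) marks
at3 = [30, 25, 33, 22, 29]  # invented final-exam (AT3) marks
r = pearson_r(at1, at3)     # positive r: higher engagement, higher exam mark
```

Significance of r is then assessed against the t distribution with n − 2 degrees of freedom, as statistical packages do automatically.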

Student perceptions: GMA user

As discussed under Methods above, student GMA users were given a series of concept statements and asked to rate them on a five-point Likert scale, where 5 = To a great extent, 4 = More often than not, 3 = To an average extent, 2 = To some extent, and 1 = Not at all. The response rate for the survey was 73.6% of users. The survey responses were imported into SPSS and analysed. Table 5 presents these findings.

Table 5. Perception survey results GMA users (n = 59).
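As a minimal sketch of how such Likert summaries are computed (the statement wordings and responses below are invented, not drawn from the survey instrument):

```python
from statistics import mean

# Invented 5-point Likert responses (5 = To a great extent ... 1 = Not at all).
responses = {
    "Quitch increased my engagement with the unit": [5, 4, 4, 3, 5, 4],
    "Quitch improved my connection with peers":     [4, 3, 4, 4, 3, 4],
}

MIDPOINT = 3  # scale midpoint; item means above it read as positive perceptions
for statement, scores in responses.items():
    m = mean(scores)
    flag = "above midpoint" if m > MIDPOINT else "at/below midpoint"
    print(f"{statement}: mean = {m:.2f} ({flag})")
```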

Additional analyses

Although not specifically part of our main analysis, we examined whether there were any differences between male and female GMA users and non-GMA users across both semesters:

  1. Differences between female users and non-users for each assessment and overall (Table 6).

  2. Differences between male users and non-users for each assessment and overall (Table 6).

  3. Differences between female and male non-users for each assessment and overall (Table 6).

  4. Differences between female and male users for each assessment and overall (Table 6).

To determine differences in student outcomes between female GMA users and female non-GMA users, an independent-samples t-test was performed to determine the effect of the GMA on female participants in the two cohorts. The results in Table 6 indicate that female GMA users (M = 15.9, SD = 3.1) had higher grades for AT1 than female non-GMA users (M = 13.7, SD = 5.2), t(64) = 2.1, p = 0.043, and that female GMA users were more engaged than female non-GMA users for In-Class activities, t(64) = 2.2, p = 0.014, and with Quizzes, t(64) = 8.437, p = 0.020.

Table 6. Table of differences in means between genders.

Similar effects were reported for male participants. The results from the independent-samples t-tests (Table 6) indicate that male GMA users (M = 14.5, SD = 3.4) had higher grades for AT1 than male non-GMA users (M = 10.0, SD = 6.0), t(94) = 4.5, p < 0.001, and that male GMA users were more engaged in In-Class activities, t(94) = 4.4, p < 0.001, and with Quizzes, t(94) = 5.0, p < 0.001.

Analyses were also conducted to determine whether there were differences between male and female GMA users and non-GMA users for each piece of assessment, as well as the overall grade. Table 6 presents the findings.

Table 6 identifies that female non-GMA users had higher engagement grades than males for AT1 and across all sub-criteria (AT1, t(80) = 2.9, p = 0.005; In-Class engagement, t(80) = 2.5, p = 0.013; Discussion Forum, t(80) = 2.4, p = 0.018; and Quizzes, t(80) = 2.7, p = 0.007). In the GMA user group, female users (M = 70.0, SD = 10.6) had a higher overall grade across all assessment, t(78) = 2.3, p = 0.022, than male GMA users (M = 64.7, SD = 9.3). The effect size, as measured by Cohen’s d, indicated a medium effect for female users in AT1, AT1.1 and AT1.3, and a large effect size for male users (AT1 d = 0.924, AT1.1 d = 0.905 and AT1.3 d = 1.03).
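Cohen's d here is the mean difference scaled by the pooled standard deviation (conventional benchmarks: roughly 0.2 small, 0.5 medium, 0.8 large). A sketch with invented scores:

```python
import math
from statistics import mean, variance

def cohens_d(a, b):
    """Cohen's d: standardised mean difference using the pooled SD."""
    na, nb = len(a), len(b)
    pooled_sd = math.sqrt(((na - 1) * variance(a) + (nb - 1) * variance(b))
                          / (na + nb - 2))
    return (mean(a) - mean(b)) / pooled_sd

# Invented AT1 marks; a gap of ~1.9 pooled SDs would be a large effect.
users = [16, 14, 15, 17, 13]
non_users = [12, 13, 11, 14, 10]
d = cohens_d(users, non_users)
```

Unlike the t statistic, d does not grow with sample size, which is why it complements the p-values reported above.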

Discussion of results

Our study answers the call for greater understanding of the role that educational technology plays in student engagement by examining whether the utilisation of a GMA can enhance student engagement, improve student achievement of learning outcomes and increase retention. As endorsed by Kahu (2013), a range of learning activities was implemented to support student learning throughout the semester. The following discussion is linked to the study’s hypotheses.

Hypotheses 1–4

The results of the various independent-samples t-tests suggest a significant increase in overall student engagement (AT1), and specifically within In-Class engagement (AT1.1) and Quizzes (AT1.3) (see Table 3). The improved results in these sub-criteria not only support prior research (see Zainuddin et al., 2020) that a GMA enhances student engagement but also demonstrate its superiority over non-GMA strategies such as requiring students to post in the Discussion Forum (AT1.2). Additional evidence from paired-samples t-tests, comparing means for GMA users between AT1.2 and each of the other two sub-criteria (AT1.1 and AT1.3) respectively (results available on request), further validates increased engagement for the assessment items using the GMA (AT1.1 and AT1.3). Although the improved outcomes observed with GMA use could partly be due to increased monitoring through the platform, it was also noted that using the GMA in the classroom often prompted students to participate actively in collaborative discussions with their peers and lecturers.

The insignificant results for AT1.2 might stem from the cumbersome process of accessing the Discussion Forum: students must log into the LMS and navigate through multiple clicks before they can participate. Despite posting questions that should encourage engagement, we wonder whether this method of engagement no longer appeals to students. Further, as discussed above, Quiz results were consistently significant in contrast to Discussion Forum posts. This raises the question of whether engagement in Discussion Forums could be improved by operationalising forums like a ‘chat’ function, delivering discussion prompts directly to students’ mobile phones, like an SMS. Food for thought (Note 8).

The implementation of the GMA in this study did not yield statistically significant changes in student outcomes for the assessment components AT2, AT3 and overall. This finding, while initially surprising, opens the door for further enhancements in our teaching methodologies. Given that AT2 covers only one week of learning material during the semester, no significant difference was anticipated in its outcomes. AT1 and AT3, on the other hand, span most of the semester and account for 70% of the overall grade, so we expected the GMA to show more pronounced differences in AT3 and overall. This aligns with previous research (e.g. Beatson et al., 2020; Schmitz & Hanke, 2023) finding that student engagement is a key influence on student achievement and motivation. Ng and Lo (2023) found that the introduction of a gamified flipped classroom significantly improved student engagement and that student learning performance was sustained throughout the study. Our analysis of AT1 showed significant differences for engagement with the subject and its learning materials (see Table 3), leading us to expect improvements in the achievement of learning outcomes for AT3 and overall. However, contrary to these expectations and the broader literature, our results indicate that the GMA had little effect on the transmission and reception of substantive knowledge, as indicated by the results for AT3 and overall (see Table 3). It did, however, foster students’ ‘academic engagement’ (van der Meer et al., 2018), as evidenced by the AT1 results. This finding is similar to some studies examined by Zainuddin et al. (2020) (for example, Ding et al., 2017; Hassan et al., 2021; Huang & Hew, 2018). Thus, while the statistical significance of the results is limited, the real-world impact for educators, particularly the increase in academic engagement and higher student retention (refer to the discussion of H5 below), underscores the GMA’s potential to enhance student learning and signals an area for further research.

Student perceptions: GMA user

The perception survey responses reported means greater than the scale midpoint (see Table 5), indicating that GMA users rated their experience with Quitch positively, reporting enhanced engagement with the unit and learning materials and stronger connections with their peers and teachers. The survey results, coupled with the other findings in this study, suggest that the GMA not only improved student enjoyment but also led to a significant improvement in engagement (AT1) and retention among our GMA users. These findings support Fredricks et al.’s (2004) argument that if students feel motivated and connected to their teachers and peers, they are more likely to be engaged in learning. From an educator’s perspective, the positive student psychological and academic outcomes observed in this study are extremely encouraging and warrant further investigation into the use of GMAs.

Hypothesis 5: The use of GMA increases retention of students in subject

Analysis was conducted to determine the number of students, for both non-GMA users and GMA users, who submitted each assessment. Given this institution’s lower entry requirements and its greater population from disadvantaged communities, this subject had developed a pattern in which students often disengaged and failed to submit later assessments if they struggled or felt disconnected from the learning material. However, our results revealed a noteworthy trend: GMA users were more likely than non-GMA users to submit later assessments (95.8% vs. 83% for AT2; 87.4% vs. 83% for AT3). Pearson correlation tests further underscored this by showing a statistically significant positive correlation for GMA users between student engagement (AT1) and completing AT2 and AT3 (see Table 4). This may be attributable to the GMA engaging students more in learning activities, resulting in students feeling more motivated and connected to their teachers and peers (as indicated by the student perception survey). The rise in assessment completion rates associated with GMA use is promising. The higher student retention, particularly given the larger proportion of students from disadvantaged communities, means a broader cohort of students can increase their knowledge base and gain grade points from completing the subject. This outcome is educationally significant, despite the modest overall mark improvement.

Additional analysis

Although both genders in the GMA cohort reported higher means on all assessment items apart from AT2, it is interesting to note that the increases in means for the assessment items directly impacted by the GMA (AT1, AT1.1 and AT1.3) were all statistically significant (see Table 6), and there was less variability in the means when comparing the two cohorts (see Table 2). The large effect size for male users suggests a notable and potentially meaningful difference in male student engagement when using a GMA, and may offer additional value, particularly in addressing the unique learning needs and preferences of male students. Possibly, GMAs could benefit weaker male students in particular by motivating them to persist with later assessments where previously they might have dropped out, a finding that warrants further research.

Conclusion

The study found a significant improvement in student engagement with the use of the GMA. This was further supported by student perception survey results indicating that students enjoyed the use of the GMA and that it had a positive influence on their relationships with peers and teachers, and with the learning material. The overall positive experience of using the GMA also motivated a greater number of students to complete later assessments rather than dropping out, and the GMA has been retained for subsequent semesters. The results, however, do not support the hypothesis that a GMA improves achievement of learning outcomes for the final assessment or overall achievement, although we did see an increase in the number of students who completed. This is contrary to prior research and to the expected outcome of this study. The findings have prompted the national unit manager to review the current structure and format for a more interactive final assessment.

The statistical findings indicate increased student engagement in both the In-Class and Quiz components of Assessment 1: Student engagement. Specifically, GMA usage, as seen in In-Class engagement and Quizzes, surpasses traditional methods such as contributing to Discussion Forums. Our paired-samples comparisons between the Discussion Forum and the GMA-supported components further support this, revealing higher In-Class engagement for GMA users. Notably, Quitch quiz responses in class often led to collaborative discussions among students and instructors. The non-significant results for the Discussion Forum postings may stem from the cumbersome process of accessing them through the LMS, raising questions about the method’s appeal or whether it is too time-consuming. The consistent significance of Quiz results versus non-significant Discussion Forum posts suggests potential alternative approaches, such as implementing a ‘chat’ function akin to an SMS sent directly to students’ mobile phones – a consideration for LMS developers.

The potential benefits of using GMAs in university subjects, specifically in terms of assessment design, warrant further research. Traditional university courses often rely on a small number of large assessments, but our GMA experience suggests that multiple small assessments or activities could be developed in their place. The opportunity to be active and engaged was found to be a key factor in student motivation, and many small activities, as suggested by gamification, may be more effective than a small number of larger assessments. This idea is consistent with the broader principle of gamification, which emphasises the importance of providing structured opportunities for participants to engage and compete in a fun way that motivates behavioural change.

This research makes a meaningful contribution by delving into the nexus of accounting education and technology. It introduces creative teaching methods and has the potential to improve the quality of accounting graduates. Moreover, it adds to the ongoing conversation about the optimal preparation of accounting students for successful careers in a rapidly evolving profession.

It is important to note that while the novelty effect of a gamified mobile application may have played a role in the increased engagement, it is not the only factor contributing to the positive outcomes. The design and structure of the platform, such as the ability to provide regular small activities, leader boards and badges, and the convenience of a mobile platform, are all elements of gamification that can motivate and encourage engagement in the learning process. Moreover, the positive results observed in this study suggest that gamification techniques could be applied in other educational contexts to increase student engagement and motivation. Given our perception survey findings and the significant positive outcomes for students, further study of the impact of GMAs on the psychosocial aspects of various student engagement models is an important area of future research to deepen our understanding of GMAs.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Grainne Oates

Grainne Oates is the CEO and founder of Quitch, the company that developed the GMA used as a teaching intervention in this study. However, Dr Oates is not employed by the higher education institution where this study was conducted and was not involved in implementing the GMA, gathering the data or conducting the analysis. This research did not receive any specific grant funding from Quitch™ for the project.

Notes

1 We use increased submission of the final assessment as a proxy for increased retention of students in the subject.

2 Students may have revealed their pseudonym to other students; however, this was not known by the lecturer.

3 In-Class questions were released during scheduled class times and were open for only 90 s. Students who were not in class but answered a question within the time frame were identified by cross-checking attendance records against the GMA educator portal; these students were not awarded the ‘In-Class’ engagement mark as they were not in attendance at class.

4 Quitch also has an inherent competition feature, where scores are given for each correct answer over the semester and added to a Leader Board, resulting in an ultimate winner at the end of semester. Students were able to see where they were placed (by pseudonym) on the leader board in their mobile app. The competition was not part of Assessment 1 but was an added incentive to encourage students to engage with the Quizzes and the textbook.

5 Approval had been obtained from senior management of the institution.

6 A full copy of the survey instrument is available on application to the authors.

7 Differences between campuses are not analysed given all students are taught with the identical learning material.

8 The national manager has since removed the Discussion Forum component of the engagement assessment and has incorporated a chat function operationalised through the LMS. It is early days; however, results so far are positive, indicating increased student interaction using this method.

References

  • Arbaugh, J. (2000). How classroom environment and student engagement affect learning in Internet-based MBA courses. Business Communication Quarterly, 63(4), 9–26. https://doi.org/10.1177/108056990006300402
  • Arjomandi, A., Seufert, J., O’Brien, M., & Anwar, S. (2018). Active teaching strategies and student Engagement: A comparison of traditional and non-traditional business students. Journal of Business Education and Scholarship of Teaching, 12(2), 120–140.
  • Arufe Giráldez, V., Sanmiguel-Rodríguez, A., Ramos Álvarez, O., & Navarro-Patón, R. (2020). Can gamification influence the academic performance of students? Sustainability, 14(9), 5115. https://doi.org/10.3390/su14095115
  • Beatson, N., Gabriel, C., Howell, A., Scott, S., Van der Meer, J., & Wood, L. (2020). Just opt in: How choosing to engage with technology impacts business students’ academic performance. Journal of Accounting Education, 50(100641), 1–17. https://doi.org/10.1016/j.jaccedu.2019.100641
  • Bharucha, J. (2017). Building student engagement through collaborative practice in business management education. International Journal of Virtual and Personal Learning Environments, 7(2), 1–12. https://doi.org/10.4018/IJVPLE.2017070101
  • Bond, M., Buntins, K., Bedenlier, S., Zawacki-Richter, O., & Kerres, M. (2020). Mapping research in student engagement and educational technology in higher education: a systematic evidence map. International Journal of Educational Technology in Higher Education, 17(1), 1–30.
  • Carvalho, A., Teixeira, S., de Campanella, L., & Costa, T. (2021). Pedagogical innovation in higher education and active learning methodologies – a case study. Education + Training, 63(2), 195–213. https://doi.org/10.1108/ET-05-2020-0141
  • Chen, P.-S. D., Lambert, A. D., & Guidry, K. R. (2010). Engaging online learners: The impact of Web-based learning technology on college student engagement. Computers & Education, 54(4), 1222–1232. https://doi.org/10.1016/j.compedu.2009.11.008
  • Cheng, L., Ritzhaupt, A., & Antonenko, P. (2019). Effects of the flipped classroom instructional strategy on students’ learning outcomes: A meta-analysis. Educational Technology Research and Development, 67(4), 793–824. https://doi.org/10.1007/s11423-018-9633-7
  • Chickering, A. W., & Gamson, Z. F. (1999). Development and adaptations of the seven principles for good practice in undergraduate education. New Directions for Teaching and Learning, 1999(80), 75–81. https://doi.org/10.1002/tl.8006
  • Coates, H. (2010). Development of the Australasian survey of student engagement (AUSSE). Higher Education, 60(1), 1–17. https://doi.org/10.1007/s10734-009-9281-2
  • Crosling, G., Heagney, M., & Thomas, L. (2009). Improving student retention in higher education: Improving teaching and learning. The Australian Universities’ Review, 51(2), 9–18.
  • Denny, P. (2013). The effect of virtual achievements on student engagement. Conference on Human Factors in Computing Systems - Proceedings, 763–772. https://doi.org/10.1145/2470654.2470763
  • Ding, L., Kim, C., & Orey, M. (2017). Studies of student engagement in gamified online discussions. Computers & Education, 115, 126–142. https://doi.org/10.1016/j.compedu.2017.06.016
  • Ferriz-Valero, A., Osterlie, O., Garcia Martinez, S., & Garcia-Jaen, M. (2020). Gamification in physical education: Evaluation of impact on motivation and academic performance within higher education. International Journal of Environmental Research and Public Health, 17(12), 4465. https://doi.org/10.3390/ijerph17124465
  • Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109. https://doi.org/10.3102/00346543074001059
  • Hamari, J., Shernoff, D. J., Rowe, E., Coller, B., Asbell-Clarke, J., & Edwards, T. (2016). Challenging games help students learn: An empirical study on engagement, flow and immersion in game-based learning. Computers in Human Behavior, 54, 170–179. https://doi.org/10.1016/j.chb.2015.07.045
  • Hassan, M. A., Habiba, U., Majeed, F., & Shoaib, M. (2021). Adaptive gamification in e-learning based on students’ learning styles. Interactive Learning Environments, 29(4), 545–565. https://doi.org/10.1080/10494820.2019.1588745
  • Huang, B., & Hew, K. F. (2018). Implementing a theory-driven gamification model in higher education flipped courses: Effects on out-of-class activity completion and quality of artifacts. Computers & Education, 125, 254–272. https://doi.org/10.1016/j.compedu.2018.06.018
  • Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education, 38(5), 758–773. https://doi.org/10.1080/03075079.2011.598505
  • Knowles, M. S. (1973). The adult learner: A neglected species. Gulf Pub. Co.
  • Korobova, N., & Starobin, S. S. (2015). A comparative study of student engagement, satisfaction, and academic success among international and American students. Journal of International Students, 5(1), 72–85. https://doi.org/10.32674/jis.v5i1.444
  • Krath, J., Schürmann, L., & von Korflesch, H. F. O. (2021). Revealing the theoretical basis of gamification: A systematic review and analysis of theory in research on gamification, serious games and game-based learning. Computers in Human Behavior, 125, 106963. https://doi.org/10.1016/j.chb.2021.106963
  • Krause, K. L., & Coates, H. (2008). Students' engagement in first-year university. Assessment and Evaluation in Higher Education, 33(5), 493–505. https://doi.org/10.1080/02602930701698892
  • Laird, T. F. N., & Kuh, G. D. (2005). Student experiences with information technology and their relationship to other aspects of student engagement. Research in Higher Education, 46(2), 211–233. https://doi.org/10.1007/s11162-004-1600-y
  • Lo, C. K. (2023). Strategies for enhancing online flipped learning: A systematic review of empirical studies during the COVID-19 pandemic. Interactive Learning Environments, (ahead-of-print), 1–29. https://doi.org/10.1080/10494820.2023.2184392
  • Luthans, K. W., Luthans, B. C., & Palmer, N. F. (2016). A positive approach to management education: The relationship between academic PsyCap and student engagement. The Journal of Management Development, 35(9), 1098–1118. https://doi.org/10.1108/JMD-06-2015-0091
  • Meer, N. M., & Chapman, A. (2014). Assessment for confidence: Exploring the impact that low-stakes assessment design has on student retention. The International Journal of Management Education, 12(2), 186–192. https://doi.org/10.1016/j.ijme.2014.01.003
  • Merriam, S. (2017). Adult learning theory: Evolution and future directions. PAACE Journal of Lifelong Learning, 26, 21–37.
  • Ng, L.-K., & Lo, C.-K. (2023). Enhancing online instructional approaches for sustainable business education in the current and post-pandemic era: An action research study of student engagement. Education Sciences, 13(1), 42. https://doi.org/10.3390/educsci13010042
  • Olelewe, C. J., Agomuo, E. E., & Obichukwu, P. U. (2019). Effects of B-learning and F2F on college students’ engagement and retention in QBASIC programming. Education and Information Technologies, 24(5), 2701–2726. https://doi.org/10.1007/s10639-019-09882-7
  • Parody, L., Santos, J., Trujillo-Cayado, L. A., & Ceballos, M. (2022). Gamification in engineering education: The use of classcraft platform to improve motivation and academic performance. Applied Sciences, 12(22), 11832. https://doi.org/10.3390/app122211832
  • Pechenkina, E., Laurence, D., Oates, G., Eldridge, D., & Hunter, D. (2017). Using a gamified mobile app to increase student engagement, retention and academic achievement. International Journal of Educational Technology in Higher Education, 14(31), 1–12. https://doi.org/10.1186/s41239-017-0069-7
  • Quitch. (2021). Quitch. Quitch. Retrieved June from https://www.quitch.com/
  • Radloff, A., & Coates, H. (2010). Doing more for learning: Enhancing engagement and outcomes: Australasian survey of student engagement: Australasian student engagement report (ACER). ACER.
  • Schmitz, B., & Hanke, K. (2023). Engage me: Learners’ expectancies and teachers’ efforts in designing effective online classes. Journal of Computer Assisted Learning, 39(4), 1132–1140. https://doi.org/10.1111/jcal.12636
  • Shernoff, D. J., Csikszentmihalyi, M., Schneider, B., & Shernoff, E. (2014a). Student engagement in high school classrooms from the perspective of flow theory. In M. Csikszentmihalyi (Ed.), Applications of flow in human development and education (pp. 475–494). Springer.
  • Shernoff, D. J., Tonks, S. M., & Anderson, B. (2014b). The impact of the learning environment on student engagement in high school classrooms. In M. Csikszentmihalyi (Ed.), Engaging youth in schools: Evidence-based models to guide future innovations. Teachers College record (1970) (Vol. 116, pp. 166–177). SAGE. https://doi.org/10.1177/016146811411601315
  • Sun, Z., Lin, C. H., Wu, M., Zhou, J., & Luo, L. (2018). A tale of two communication tools: Discussion-forum and mobile instant-messaging apps in collaborative learning. British Journal of Educational Technology, 49(2), 248–261. https://doi.org/10.1111/bjet.12571
  • TEQSA. (2020). Good Practice Note: Improving retention and completion of students in Australian higher education. Tertiary Education Quality and Standards Agency.
  • Thomas, L., & Heath, J. (2014). Institutional wide implementation of key advice for socially inclusive teaching in higher education. A practice report. The International Journal of the First Year in Higher Education, 5(1), 125–133. https://doi.org/10.5204/intjfyhe.v5i1.206
  • van der Meer, J., Stephens, S., & Pratt, K. (2018). First semester academic performance: The importance of early indicators of non-engagement. Student Success, 9(4), 1–12. https://doi.org/10.5204/ssj.v9i4.652
  • Zainuddin, Z., Chu, S. K. W., Shujahat, M., & Perera, C. J. (2020). The impact of gamification on learning and instruction: A systematic review of empirical evidence. Educational Research Review, 30, 100326–23. https://doi.org/10.1016/j.edurev.2020.100326
  • Zhoc, K. C. H., Webster, B. J., King, R. B., Li, J. C. H., & Chung, T. S. H. (2019). Higher education student engagement scale (HESES): Development and psychometric evidence. Research in Higher Education, 60(2), 219–244. https://doi.org/10.1007/s11162-018-9510-6