
Success factors of recently implemented eLearning methods at higher education institutions in Kuwait


Abstract

This study determines the critical success factors for students and academic staff when applying and evaluating online delivery methods in colleges and universities in Kuwait. The recently implemented eLearning systems and methods in the country, due to the COVID-19 pandemic, are evaluated and the perception of the eLearning system is gauged. Targeted surveys are distributed to a representative sample of undergraduate engineering students and academic staff. The following critical success factors are considered: benefits of the eLearning system, educational system quality, information quality, instructor quality, learner quality, service quality and technical system quality. Results show that there is a correlation between the perceptions of students and academic staff, particularly regarding instructor quality, information quality and benefits of the eLearning system. Both groups of respondents agreed on the high importance of instructor quality and the low importance of benefits.

Introduction

The COVID-19 pandemic wrought havoc on many aspects of day-to-day life. Countries around the world ordered the closure of schools and universities for an indefinite period to contain the spread of the virus. A prolonged closure of academic institutions affects the progression and matriculation of students at all levels. Over the course of the outbreak, many institutions worldwide transitioned to online education to ensure that students could continue their education while maintaining the physical distancing recommended by the World Health Organisation. Kuwait’s Ministry of Education began exploring the possibility of allowing institutions to introduce electronic course delivery methods. With blended learning already used extensively by foreign-language schools in the country, higher education institutions worked closely with their respective regulatory bodies to resolve the delay in the progression of the Spring 2020 semester, which resulted in the transition to a fully online delivery method.

Basak et al. (Citation2016) specified many critical success factors to consider when evaluating the option of online learning. Alhabeeb and Rowley (Citation2017) stated that transitioning to online learning provides major opportunities to improve students’ learning experience, whereas Martínez et al. (Citation2019) claimed that it broadens the appeal of offered programmes. However, Palvia et al. (Citation2018) argued that several challenges have been identified in case studies conducted in other Gulf Cooperation Council nations. The applicability of these challenges in the local context needs to be evaluated, and the challenges eventually overcome, if online education is to be seriously pursued.

With an increasing percentage of the population becoming technology-literate, Abdulwahed et al. (Citation2015) stated that offering online education can modernise and enhance the learning process while increasing its appeal, accessibility and convenience to students. Additionally, Robinson (Citation2017) concurred with embracing the wave of online education and argued that it will benefit academic institutions in the long run. Bae et al. (Citation2015) noted the universal acceptance, and the inevitability, of incorporating forms of education that rely heavily on online delivery methods into curricula, and stressed that current and novel forms of delivery must be continuously evaluated and adapted to suit the needs of a particular community of learners. Stone (Citation2019) suggested that creative methods of content delivery must be used in online classes to engage students in their learning in ways that are not possible with face-to-face learning. Furthermore, Bennani et al. (Citation2012) argued that traditional methods of student assessment must be scrapped in favour of alternative methods that provide a more comprehensive overview of a student’s performance in a course. Hands-on learning experiences (such as laboratory activities and project-based learning) must also be adapted to account for the lack of physical presence by utilising remotely controlled virtual laboratories or wholly simulated live laboratory sessions, which Dutta and Bhattacharjee (Citation2019) have shown to be preferred by students for their benefits in accessibility.

A case study performed in Kuwait showed that university-level students are ready to embrace the change associated with migrating to online education, provided they are trained to use the educational platforms to their fullest extent. Nonetheless, Safar (Citation2012) reported that some students expressed concerns about the lack of an instructor’s presence to motivate them to be fully engaged in their learning. These concerns, however, could be overcome with time and changes in the perception of online learning, as well as by employing new technologies to compensate for the instructor’s lack of physical presence.

Considering the COVID-19 pandemic, studies were carried out regionally by Alkhalil et al. (Citation2021) and internationally by Coman et al. (Citation2020) and Shahzad et al. (Citation2020); these studies aimed to evaluate the effectiveness of using pre-existing eLearning platforms to deliver learning experiences remotely. The studies showed that, despite some hurdles, the eLearning experience was rich and beneficial to students as they continued their studies during the pandemic. However, most of these pandemic-era studies were student-centred and did not include the perspectives of academic staff.

Since students and academic staff are arguably the main stakeholders of the eLearning environment, the perspectives of both groups are important to consider. Especially given budget constraints and the priorities of private educational institutions, both perspectives should be analysed to set realistic priorities. The findings of this study are expected to be useful in enhancing the current state of preparation, knowledge transfer and evaluation patterns in eLearning education. The findings are also expected to inform educational institutions and policymakers seeking to improve eLearning education. To achieve the expected results, it is important to assess the perspectives of both students and academic staff on eLearning education (Selvaraj et al., Citation2021).

This study aims to identify the perspectives on eLearning among academic staff and undergraduate students in Kuwait. Furthermore, significant differences between the perspectives of the two groups of respondents are identified. Finally, a ranking of success factors contributing to successful eLearning in institutions of higher education in Kuwait is derived. Based on the described purposes, the following research questions are explored:

  1. What is the perspective of academic staff on eLearning in Kuwait?

  2. What is the perspective of undergraduate students on eLearning in Kuwait?

  3. Can significant differences be identified between the two groups of respondents?

  4. What is the ranking of success factors from the student perspective?

  5. What is the ranking of success factors from the academic staff perspective?

Methods

The following sections describe the variables, scales, data collection process and statistical analysis.

Based on an extensive and recent literature review, supplemented by consideration of expert opinion, Al-Fraihat et al. (Citation2020) identified 58 measurement items to measure eLearning system success using a multidimensional model, and provided extensive empirical evidence for the relationships between the model constructs.

During two focus group meetings of the authors of this study, each of these 58 measurement items was discussed regarding (1) its relevance to the context of eLearning in Kuwait and (2) its clarity for the two groups of respondents. These discussions led to the list of questionnaire items for the student questionnaire (Table 1), which required minor adjustments for the academic staff questionnaire to ensure relevance. References of these items to previous studies are given in Al-Fraihat et al. (Citation2020).

Table 1. Constructs and Questionnaire items

Schernhammer (Citation2004) showed that measuring subjective perceptions of respondents involves challenges, which can be diminished by utilising multiple-item scales. At the same time, studies by Ponzurick et al. (Citation2000) and Bergkvist and Rossiter (Citation2007) showed that constructs such as perceptions can be measured reliably by single-item scales. For the present study, priority was given to limiting the number of questionnaire items as much as possible, since anecdotal evidence suggested a negative impact on response reliability when questionnaire items are numerous and apparently similar in content. This resulted in seven constructs: Benefits (B), Educational System Quality (ESQ), Information Quality (InfoQ), Instructor Quality (IQ), Learner Quality (LQ), Service Quality (SQ) and Technical System Quality (TSQ). Information Quality (InfoQ) and Educational System Quality (ESQ) are measured with single-item scales; the remaining five constructs are measured with multiple-item scales. All questionnaire items were rated on a five-point response scale from 1 (strongly disagree) to 5 (strongly agree). The student questionnaire was also translated into Arabic so that students could choose between the English and Arabic versions, with the aim of minimising the impact of students’ language proficiency.
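To illustrate how construct scores can be derived from such a questionnaire, the following minimal Python sketch averages the items of a multiple-item scale and passes a single-item scale through unchanged; the item names and ratings are hypothetical, not the study’s actual item mapping or data.

```python
import pandas as pd

# Each row is one respondent; columns hold 1-5 Likert ratings (hypothetical names).
responses = pd.DataFrame({
    "IQ_1": [5, 4, 3], "IQ_2": [4, 4, 2], "IQ_3": [5, 3, 3],  # Instructor Quality items
    "InfoQ_1": [4, 3, 5],                                      # single-item construct
})

# Illustrative item-to-construct mapping (not the exact mapping used in the study).
constructs = {
    "IQ": ["IQ_1", "IQ_2", "IQ_3"],   # multiple-item scale: average the items
    "InfoQ": ["InfoQ_1"],             # single-item scale: the item itself
}

# Construct score per respondent = mean of that construct's item ratings.
scores = pd.DataFrame({name: responses[items].mean(axis=1)
                       for name, items in constructs.items()})
print(scores)
```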

Social media was utilised to disseminate the questionnaires as widely as possible, with the aim of collecting as many responses as possible within a period of one month. Social media is the preferred method of communication among young people in Kuwait (Alenezi & Brinthaupt, Citation2022). Since eLearning was adopted across the whole country when the COVID-19 pandemic started, all students in higher education had already been exposed to eLearning at the time of the survey; the risk of collecting data from students without eLearning experience is therefore close to zero. Data from academic staff was collected via email to verify that responses genuinely came from academic staff.

The introduction of the questionnaire clarified that the target audience was limited to students and academic staff, respectively, of educational institutions in Kuwait. In addition to the questionnaire items shown in Table 1, the following demographic data were collected: gender; age; experience with the eLearning system; number of online modules (courses/subjects) taken; level of study; field of study.

A total of 407 responses from the student group and 32 responses from the academic staff group were valid for further analysis. Data related to one question (question 14) was removed from the Learner Quality construct to obtain a unified construct for both groups of respondents, since this item was not presented to the academic staff group. The demographic data of the two groups are summarised in Table 2.

Table 2. Demographic data

The student cohort is largely homogeneous, consisting primarily of individuals under the age of 30 who were educated in Kuwaiti high schools. In contrast, the academic staff cohort is heterogeneous, consisting of educators from around the world who vary in educational background, age, cultural background, experience and academic rank.

For the descriptive statistics, and to answer research questions 1 and 2, the mean value, median value and standard deviation were computed for all variables. To answer research question 3, the Mann–Whitney U test was applied, since two distinct groups of respondents (students and academic staff) were evaluating the same aspects, as explained by Cohen et al. (Citation2018). Because the test converts scores to ranks, it does not require a normal distribution of scores, and it accommodates dissimilar sample sizes (Mann & Whitney, Citation1947). The level of significance, alpha, was set to 0.05 and the results are presented in the following section. For research questions 4 and 5, the ranking was computed based on the mean values of each construct. Spearman correlation analysis was applied to identify relationships between success factors and the experience of students and academic staff.
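As a minimal sketch of this analysis plan in Python (using SciPy), the following computes the descriptive statistics and the Mann–Whitney U test for one construct; the score arrays are hypothetical placeholders, not the collected data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical construct scores for one success factor (not the study data).
students = np.array([4, 3, 5, 3, 4, 2, 5, 3])
staff = np.array([4, 5, 4, 4, 3])

# Descriptive statistics for research questions 1 and 2.
for name, group in [("students", students), ("academic staff", staff)]:
    print(name, group.mean(), np.median(group), group.std(ddof=1))

# Rank-based Mann-Whitney U test for research question 3 (alpha = 0.05);
# it requires neither normally distributed scores nor equal sample sizes.
u, p = mannwhitneyu(staff, students, alternative="two-sided")
print(f"U = {u}, p = {p:.4f}")
```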

Discussion of results

Following widespread practice, Cronbach’s alpha was computed for both groups of respondents to give an indication of the internal consistency of the multi-item scales (Cronbach, Citation1951). Values below 0.5 are considered to reflect unacceptable scales according to George and Mallery (Citation2003), which means that the scales for Service Quality (alpha = 0.5) and Learner Quality (alpha = 0.5) are only marginally above this threshold. All other values are above 0.5; the full set is shown in Table 3. However, problems regarding the meaning of Cronbach’s alpha have been reported by Sijtsma (Citation2008) and, therefore, these values should not be overemphasised.

Table 3. Cronbach's Alpha of multi-item scales
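For reference, a minimal sketch of how Cronbach’s alpha can be computed for a multi-item scale, following the standard formula from Cronbach (Citation1951); the response matrix below is hypothetical.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = items of one scale."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 ratings: four respondents, three items of one construct.
items = np.array([[4, 5, 4], [3, 3, 2], [5, 4, 5], [2, 3, 3]])
print(round(cronbach_alpha(items), 2))
```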

To answer research question 1 (What is the perspective of academic staff on eLearning in Kuwait?) and research question 2 (What is the perspective of undergraduate students on eLearning in Kuwait?), descriptive statistics were computed. For the academic staff group, the highest mean value was found for Service Quality (SQ) with 4.1 (SD = 0.7) and the lowest mean values for Information Quality (InfoQ) with 3.4 (SD = 1.2) and Benefits (B) with 3.4 (SD = 1.0). This perspective reflects the high importance of professional IT service during the implementation of the eLearning system to minimise disruptions of the teaching and learning process. Since the teaching and learning material was largely identical to the material used before the pandemic, and since teaching from home created considerable challenges related to disruptions by family members (many instructors had to supervise their own children while teaching online), information quality and the benefits of the eLearning system were rated rather low. Regarding the student group, the highest mean value was found for Instructor Quality (IQ) with 3.6 (SD = 0.9) and the lowest mean value for Information Quality (InfoQ) with 3.4 (SD = 1.2). As in teaching approaches that do not utilise eLearning, the instructor is the main point of contact for students and is therefore perceived as the most important factor of the eLearning system from the students’ perspective. The low rating of information quality might also reflect that students did not perceive any issue with the teaching and learning material communicated through the eLearning system. It is worth noting that while the student group had a larger number of respondents, both groups displayed similar standard deviations, indicating comparable consensus within each group (Table 4).

Table 4. Descriptive statistics

Based on the mean values, the academic staff group gave more varied responses to the survey questions, as indicated by the range of the mean values. This can be attributed to the number of years of experience with eLearning: the academic staff cohort had a significantly higher percentage of respondents with more than one year of experience (44%) than the student cohort (21%). It can also be attributed to the demographic makeup of the academic staff cohort, who are more diverse in educational background and experience than the student cohort.

Comparing the standard deviations of the responses, higher standard deviations appear in the student cohort, which can be attributed to its larger sample size compared with the academic staff cohort. The student cohort also shows more consistent mean values across all factors than the academic staff cohort. In the student cohort, all median values were greater than or equal to the mean value, while the academic staff cohort had one outlier (SQ) whose mean is greater than the median. This can be attributed to the level of seriousness shown while taking the survey, which is expected to be higher in the academic staff cohort.

Finally, according to Sedgwick and Greenwood (Citation2015), the Hawthorne effect may have contributed to the differing spread of mean values, since academic staff can be considered more mature than students, who may have been hesitant to choose extreme scores.

To answer research question 3 (Can significant differences be identified between the two groups of respondents?), the Mann–Whitney U (MWU) test was applied.

Comparisons regarding two of the seven model factors show a statistically significant difference between the academic staff and student groups. The comparison regarding Service Quality (SQ) shows medians for the academic staff and student groups of 4 and 3.7 respectively (U = 4011, N1 = 32, N2 = 407, p = 0.0003), whereas the comparison regarding Instructor Quality (IQ) shows medians of 4 and 3.8 respectively (U = 4952, N1 = 32, N2 = 407, p = 0.0238). The comparisons of the remaining five constructs do not present a statistically significant difference between the two groups of respondents. This shows the difference in how students and academic staff perceive the Instructor Quality (IQ) and Service Quality (SQ) factors. For the Service Quality factor, an institution’s IT department would be more inclined to offer faster and more comprehensive support to academic staff, as this could in turn solve technical issues that students may face. As for the Instructor Quality factor, students consider their instructors the main factor of their learning experience, as instructors ‘set the tone’ for the duration of the course of study, whereas academic staff would consider themselves more as part of an integrated system that shapes the students’ learning experience (Table 5).

Table 5. Academic staff versus Student perspective (Mann–Whitney U test)

To answer research question 4 (What is the ranking of success factors from the student perspective?) and research question 5 (What is the ranking of success factors from the academic staff perspective?), the mean values of both groups were ranked.

From the academic staff perspective, Service Quality (SQ) ranks first (mean = 4.07) and Benefits of the eLearning system (B) ranks seventh (mean = 3.43). From the student perspective, Instructor Quality (IQ) ranks first (mean = 3.63) and Information Quality (InfoQ) ranks seventh (mean = 3.38).
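A minimal sketch of this mean-based ranking in Python follows; note that only the SQ and B values below match the reported academic staff means, while the remaining values are hypothetical placeholders.

```python
import pandas as pd

# Construct means for one cohort; only SQ (4.07) and B (3.43) are the values
# reported above for academic staff, the others are hypothetical placeholders.
means = pd.Series({"SQ": 4.07, "IQ": 3.95, "ESQ": 3.80, "TSQ": 3.70,
                   "LQ": 3.60, "InfoQ": 3.45, "B": 3.43})

# Rank 1 = highest mean, matching the ranking convention used in Table 6.
ranking = means.rank(ascending=False).astype(int)
print(ranking.sort_values())
```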

A relationship between the student ranking and the academic staff ranking can be seen: Instructor Quality (IQ) is ranked highly in both cohorts (first for students and second for academic staff), while Information Quality (InfoQ) and Benefits (B) are ranked lowest in both cohorts (6th and 7th for academic staff, 7th and 6th for students, respectively). These rankings are plausible given that most respondents had little (less than one year of) experience with eLearning systems (Table 6).

Table 6. Ranking of success factors by academic staff versus students

Results of the Spearman correlation analysis of the relationship between success factors and the experience of academic staff and students reflect the following. For both groups, the correlation between System Quality and experience is virtually zero (academic staff: 0.05; students: −0.03). This may reflect that neither group had experienced other eLearning systems before. However, the highest correlation for both groups is between Learner Quality and Benefits (academic staff: 0.77; students: 0.85), which reflects that both groups recognise the importance of learner quality to the perceived benefits of the eLearning system.
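A minimal sketch of such a Spearman correlation in Python using SciPy; both score lists are hypothetical, not the study data.

```python
from scipy.stats import spearmanr

# Hypothetical per-respondent scores for two constructs (not the study data).
learner_quality = [4, 3, 5, 2, 4, 3]
benefits = [4, 3, 5, 2, 5, 4]

# Spearman's rho works on ranks, so it suits ordinal Likert-type scores.
rho, p = spearmanr(learner_quality, benefits)
print(f"rho = {rho:.2f}, p = {p:.3f}")
```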

Finally, students do not perceive any relationship between experience and the success factors, as reflected by correlation values close to zero; this is not the case for academic staff. This may reflect the greater maturity of academic staff compared with students regarding their knowledge of eLearning systems (Tables 7 and 8).

Table 7. Correlation of critical success factors and experience (academic staff)

Table 8. Correlation of critical success factors and experience (students)

Conclusions

Based on the results of this study, the following conclusions can be drawn.

Regarding research questions 1 and 2 (the perspectives of academic staff and undergraduate students on eLearning in Kuwait), the results of the descriptive analysis show the paramount importance of IT Service Quality (SQ) from the academic staff perspective, and of Instructor Quality (IQ) from the student perspective, for ensuring a successful eLearning experience. The Spearman correlation analysis also shows that both groups had a similar perspective regarding the highest and lowest relationships between success factors and experience. However, the knowledge and maturity of academic staff are reflected in stronger relationships between perceived success factors and experience; for students, virtually no relationship was identified because of their limited experience.

Confirming the previous interpretation of the descriptive results, the Mann–Whitney U test shows a significant difference between academic staff and students (research question 3) for only two success factors, Instructor Quality (IQ) and Service Quality (SQ). This may show the importance of professional development for IT staff and academic staff.

The rankings of the success factors (research questions 4 and 5) indicate that both cohorts, academic staff and students, perceive Instructor Quality (IQ) as an important success factor, and Benefits (B) and Information Quality (InfoQ) as less important success factors. The perceived low importance of Benefits (B) can be explained by several factors, most notably the outbreak of COVID-19, which forced higher education institutions in the country to embrace eLearning when few academic staff or students were ready for such a sudden implementation.

Acknowledgements

The authors wish to express their gratitude to Dr. Abdullah O. AlMughrabi for his role in securing the funding associated with this study.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The datasets generated or analysed during the current study are not publicly available due to restrictions set by the funding agency but are available from the corresponding author on reasonable request.

Additional information

Funding

The project was funded by Kuwait Foundation for the Advancement of Sciences under project code: CORONA PROP 118.

References

  • Abdulwahed, M., Hasna, M.O. & Froyd, J.E., 2015, Advances in Engineering Education in the Middle East and North Africa: Current status, and future insights (New York, Springer).
  • Alenezi, W. & Brinthaupt, T.M., 2022, ‘The use of social media as a tool for learning: perspectives of students in the Faculty of Education at Kuwait University’, Contemporary Educational Technology, 14(1), ep340.
  • Al-Fraihat, D., Joy, M., Masa'deh, R. & Sinclair, J., 2020, ‘Evaluating eLearning systems success: an empirical study’, Computers in Human Behavior, 102, pp. 67–86.
  • Alhabeeb, A. & Rowley, J., 2017, ‘Critical success factors for eLearning in Saudi Arabian universities’, International Journal of Educational Management, 31(2), pp. 131–47.
  • Alkhalil, S., Manasrah, A., Dabbour, L., Bashayreh, E., Abdelhafez, E. & Rababa, E., 2021, ‘COVID-19 pandemic and the eLearning in higher institutions of education: Faculty of Engineering and Technology at Al-Zaytoonah University of Jordan as a case study’, Journal of Human Behavior in the Social Environment, 31(1–4), pp. 464–75.
  • Bae, E., Prasad, P.W.C., Alsadoon, A. & Bajaj, K., 2015, ‘Framework to improve delivery methods in higher education through online learning’, paper presented at 2015 IEEE 7th International Conference on Engineering Education (ICEED), pp. 130–34 (New York, IEEE).
  • Basak, S., Wotto, M. & Bélanger, P., 2016, ‘A framework on the critical success factors of e-learning implementation in higher education: a review of the literature’, International Journal of Educational and Pedagogical Sciences, 10(7), pp. 2409–14.
  • Bennani, S., Idrissi, M.K., Fadouli, N., Yassine, B.T. & Ouguengay, Y.A., 2012, ‘Online Project based learning driven by competencies: a systematic strategy proposal for assessment’, in Proceedings of 2012 International Conference on Interactive Mobile and Computer Aided Learning (IMCL), 6–8 November, pp. 92–97 (New York, IEEE).
  • Bergkvist, L. & Rossiter, J., 2007, ‘The predictive validity of multiple-item versus single-item measures of the same constructs’, Journal of Marketing Research, 44(2), pp. 175–84.
  • Cohen, L., Manion, L. & Morrison, K., 2018, Research Methods in Education (London, Routledge).
  • Coman, C., Țîru, L., Meseșan-Schmitz, L., Stanciu, C. & Bularca, M., 2020, ‘Online teaching and learning in higher education during the coronavirus pandemic: students’ perspective’, Sustainability, 12(24), 10367.
  • Cronbach, L., 1951, ‘Coefficient alpha and the internal structure of tests’, Psychometrika, 16(3), pp. 297–334.
  • Dutta, S.J. & Bhattacharjee, R., 2019, ‘Integration of virtual laboratories: a step toward enhancing e-learning technology’, in I2CT 2019, 5th International Conference for Convergence in Technology, 29–31 March, Bombay, India, pp. 1–5 (New York, IEEE).
  • George, D. & Mallery, P., 2003, SPSS for Windows Step By Step: A simple guide and reference, fourth edition (Boston, Allyn & Bacon).
  • Mann, H. & Whitney, D., 1947, ‘On a test of whether one of two random variables is stochastically larger than the other’, The Annals of Mathematical Statistics, 18(1), pp. 50–60.
  • Martínez, P.J., Aguilar, F.J. & Ortiz, M., 2019, ‘Transitioning from face-to-face to blended and full online learning engineering master's program’, IEEE Transactions on Education, 63(1), pp. 2–9.
  • Palvia, S., Aeron, P., Gupta, P., Mahapatra, D., Parida, R., Rosner, R. & Sindhi, S., 2018, ‘Online education: worldwide status, challenges, trends, and implications’, Journal of Global Information Technology Management, 21(4), pp. 233–41.
  • Ponzurick, T., France, K. & Logar, C., 2000, ‘Delivering graduate marketing education: an analysis of face-to-face versus distance education’, Journal of Marketing Education, 22(3), pp. 180–87.
  • Robinson, L., 2017, ‘Embracing online education: exploring options for success’, Journal of Marketing for Higher Education, 27(1), pp. 99–111.
  • Safar, A.H., 2012, ‘The students’ perspectives of online training at Kuwait University’, College Student Journal, 46(2), pp. 436–58.
  • Schernhammer, E., 2004, ‘Job stress and breast cancer risk: the nurses' health study’, American Journal of Epidemiology, 160(11), pp. 1079–86.
  • Sedgwick, P. & Greenwood, N., 2015, ‘Understanding the Hawthorne effect’, British Medical Journal, 351, h4672.
  • Selvaraj, A., Radhin, V., Nithin, K.A., Benson, N. & Mathew, A.J., 2021, ‘Effect of pandemic based online education on teaching and learning system’, International Journal of Educational Development, 85, 102444.
  • Shahzad, A., Hassan, R., Aremu, A., Hussain, A. & Lodhi, R., 2020, ‘Effects of COVID-19 in eLearning on higher education institution students: the group comparison between male and female’, Quality & Quantity, 55, pp. 805–26.
  • Sijtsma, K., 2008, ‘On the use, the misuse, and the very limited usefulness of Cronbach’s Alpha’, Psychometrika, 74(1), pp. 107–20.
  • Stone, C., 2019, ‘Online learning in Australian higher education: opportunities, challenges and transformations’, Student Success, 10(2), pp. 1–11.