
Online learning, perceived difficulty and the role of feedback in COVID-19 times


ABSTRACT

The closure of higher education institutions as a result of the COVID-19 pandemic has necessitated a sudden and unexpected transition from face-to-face to online teaching and learning. This paper draws on data from a broader study focusing on higher education students’ perceptions and experiences of online learning during the first lockdown in 2020. In total, 2,718 students from different Portuguese institutions participated in the study. Findings indicate that students who felt they had been provided with more feedback reported less difficulty in online teaching and learning. Students reported more perceived difficulty than perceived feedback in online teaching and learning. Findings point to the importance of feedback in fostering self-regulation of learning, thereby helping students better adapt to online teaching and learning.

Introduction

The COVID-19 pandemic entailed a sudden and compulsory shift from face-to-face to online teaching and learning. Higher education was no exception. International literature indicates the ways in which different institutions responded to the challenges and demands arising from the need to teach and learn in an online scenario (see, for instance, Flores and Gago 2020; Quezada, Talbot, and Quezada-Parker 2020).

On 13 March 2020, the Portuguese Ministry of Science, Technology and Higher Education (MCTES) announced the closure of all higher education institutions to mitigate the spread of SARS-CoV-2. All activities involving the presence of students were suspended on 16 March. It was then stipulated that online teaching and learning should be promoted to maintain teacher and student interaction via digital tools (MCTES 2020).

Although efforts were made for teaching activities to continue in an online mode, a number of constraints emerged, such as technical problems during lessons and difficulties in concentrating and in dealing with internal and external distractions (Flores, Barros, et al. 2021). In addition, the existing literature points to several challenges in teaching and learning online. Anxiety and stress caused by changes in teaching methods (Son et al. 2020; Xavier et al. 2020) and the need for preparation and training to deal with learning platforms (Lim 2020) have been reported.

A key challenge facing both teachers and students concerns assessment in online teaching and learning. Difficulties in adapting to new assessment formats (Gonzalez et al. 2020) and in assessing practical knowledge (OECD 2020) have been identified, alongside perceptions of unfairness (Flores, Veiga Simão, et al. 2021) and problems of academic dishonesty (Guangul et al. 2020).

Overall, personal and institutional factors explain different processes of adaptation when it comes to students' individual characteristics, but there are also issues of support and guidance in the online environment (Flores, Veiga Simão, et al. 2021). One need reported in existing empirical research is to investigate feedback in remote assessment in higher education, with particular regard to its benefits and drawbacks (Hast 2021). Fuller et al. (2020, 782) assert that COVID-19 entailed ‘a real opportunity to explore different assessment for learning (AfL) designs. As low stakes assessments focused on providing actionable feedback to learners, they can also be of significant value in generating data to inform faculty and curriculum planners’.

Existing research reports on the benefits of online feedback, recommending it in higher education situations where students may be learning remotely (Bridge and Appleyard 2005) and where access to feedback is easy (Hast and Healy 2018) and immediate (Grieve, Padgett, and Moffitt 2016). For instance, Broadbent (2017) concludes that online students used self-regulated learning strategies more often than blended learning students. Feedback is a key factor in self-regulated learning and in the assessment process in online environments, through such means as online video tutoring, email guidance after class (Bao 2020) and audio feedback (Hast 2021). Students who have been provided with assessment feedback via digital recordings report positive perceptions of the detail, personalisation and usability of the feedback comments (Ryan, Henderson, and Phillips 2019).

However, other studies show that when feedback is perceived negatively by students, it may have adverse effects (Hattie and Timperley 2007; Hast 2021). In particular, online feedback is only possible when adequate technological accessibility is ensured and students are motivated to access such feedback and to overcome difficulties in interpreting and implementing it (Hast 2021). Other studies show that students who failed their courses tended to interact online less frequently (Davies and Graff 2005), that students with lower grades accessed significantly fewer web pages, and that those who just passed made the least use of feedback (Harrison et al. 2013). Attention should also be paid to the number of assessment tasks assigned to students, since an excessive number can result in lower quality work on the part of teachers as well as lower quality feedback on students' learning (Gusso et al. 2020).

If feedback is to be effective it must be timely and relevant (Ramsden 2003), prospective (Sendziuk 2010), suitable to the context and recognised by both students and teachers (Orsmond, Merry, and Reiling 2005). In an online context, the effectiveness of feedback may be related to time management in online learning (Rasiah, Kaur, and Guptan 2020) and to communication mechanisms (Hatziapostolou and Paraskakis 2010). Research points to the influence of feedback on students' learning and achievement and to its role in giving students performance-relevant information (Henderson, Molloy et al. 2019). According to Winstone and Boud, feedback influences students' future work and learning strategies and should be designed so that comments provided after assessment enable students to learn and to understand what is required of them in summative tasks, although it is acknowledged that feedback practices are not always implemented in higher education contexts. As such, analysing the extent to which feedback is associated with the way university students manage online learning, along with their perceived difficulties, seems relevant and timely in a context marked by a global and compulsory shift from face-to-face to online teaching and learning in higher education.

The study

This paper draws on data from a broader research project (Flores, Barros, et al. 2021; Flores, Veiga Simão, et al. 2021) focusing on higher education students’ perceptions and experiences of online teaching and learning during the first lockdown in 2020. In this paper the following research questions are addressed:

Q1. What kinds of difficulties did higher education students experience in the shift from face-to-face to online learning?

Q2. How did they perceive the feedback received during the first lockdown?

Q3. Was teacher feedback associated with how university students managed online learning and with the difficulty they felt in this context?

Hypotheses

H1: The feedback university students received from their teachers has a direct effect on the difficulties they felt in online learning.

H2: The feedback university students received from their teachers has a direct effect on their management of online learning.

H3: The feedback university students received from their teachers has an indirect effect on the difficulties they felt through their management of online learning.

Participants

A total sample of 2,741 students from different cycles of study was gathered. For the exploratory factor analysis, a sample of 200 participants (61.9% female) was used: 17.7% were aged under 20, 64.6% between 20 and 25, 12.4% between 26 and 30, 1.8% between 31 and 35, and 3.5% between 36 and 40. Most of these participants were master’s students (61.7%), while 30.1% were undergraduates and 6.2% were PhD candidates. The remaining 2,541 participants (66.2% female) took part in the study for the confirmatory factor analysis and structural equation modelling. Most were master’s students (49.4%), while 47.8% were undergraduate students, 1.8% were PhD candidates, and 1.0% were enrolled in a specialised programme.

Instruments

A self-report measure was used as it offered information about the subjective experiences of participants regarding their online learning. In addition to the students’ biographical data, an open-ended questionnaire was included, which addressed three issues: (1) perceptions and experiences of perceived difficulties in online teaching and learning (How did the students adapt to the closure of the university and what difficulties did they experience?); (2) perceptions and experiences of their management of online learning (What were the students’ experiences of online learning?); and (3) perceptions and experiences of the feedback received (What kind of feedback did the students receive and when did it occur?).

Perceived feedback in online learning

Perceived Feedback in Online Learning (PFOL) is a 6-item instrument created for this study that asks participants to indicate on a Likert-type scale from 1 (totally disagree) to 5 (totally agree) to what extent they agreed with a series of statements about feedback, considering their online learning experience (see Table 1 for the items). An exploratory factor analysis (EFA) of the PFOL instrument was computed with FACTOR 10.8.02 (Lorenzo-Seva and Ferrando 2013) and IBM SPSS 26.0, using data from 200 participants, to examine the instrument’s internal structure. Table 1 shows the correlations between all variables and the descriptive statistics. All variables showed skewness values below 2 and kurtosis values below 5, as indicated in the literature (Bollen and Long 1993). The data were tested for their underlying structure with the Kaiser-Meyer-Olkin (KMO) Measure of Sampling Adequacy and Bartlett’s Test of Sphericity. The KMO measure presented a good score of .81, whereas Bartlett’s test of sphericity was χ2(15) = 336.4 (p < .001), showing that the variables were adequate for factor analysis. Multivariate normality can be assumed if Mardia’s coefficient is lower than P(P + 2), where P is the number of observed variables. In this study, 6 observed variables were analysed, with a Mardia’s coefficient for skewness of 8 < 6(6 + 2) = 48 and for kurtosis of 56 > 6(6 + 2) = 48 (Bollen and Long 1993); since the kurtosis coefficient exceeded this threshold, multivariate normality could not be assumed. Hence, we used unweighted least squares as the method for factor extraction, since it does not depend on distributional assumptions (Joreskog 1977). To extract the appropriate number of factors, various factor retention criteria were used, namely Velicer’s MAP test and Horn’s parallel analysis. These methods were used because they perform optimally in determining the number of factors to extract (Bandalos and Finney 2010), and they were applied to find an approximation to a simple interpretable structure. The analysis suggested a unidimensional factor model for the PFOL with good reliability scores in accordance with the psychometric literature (Nunnally 1978). Moreover, the values of goodness-of-fit (GFI = .97) and residual statistics (RMSEA = .10) were also good as suggested by the literature (McDonald 1999; Nunnally 1978; Velicer 1976).
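To make the retention step concrete, the sketch below shows how Bartlett’s test, the KMO measure and Horn’s parallel analysis can be reproduced in Python. It is a minimal illustration under stated assumptions rather than the authors’ code: the DataFrame name pfol_items is hypothetical, the factor_analyzer package stands in for FACTOR 10.8.02, and the parallel analysis is a generic implementation of Horn’s procedure.

```python
import numpy as np
import pandas as pd
from factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def check_adequacy(items: pd.DataFrame) -> None:
    """Bartlett's test of sphericity and the KMO measure of sampling adequacy."""
    chi2, p = calculate_bartlett_sphericity(items)
    _, kmo_overall = calculate_kmo(items)  # per-item KMOs are also returned
    print(f"Bartlett chi2 = {chi2:.1f} (p = {p:.4f}); overall KMO = {kmo_overall:.2f}")

def parallel_analysis(items: pd.DataFrame, n_iter: int = 500, seed: int = 0) -> int:
    """Horn's parallel analysis: retain factors whose observed eigenvalues
    exceed the mean eigenvalues of correlation matrices of random data."""
    rng = np.random.default_rng(seed)
    n, p_vars = items.shape
    observed = np.sort(np.linalg.eigvalsh(items.corr().to_numpy()))[::-1]
    random_means = np.zeros(p_vars)
    for _ in range(n_iter):
        sim = rng.standard_normal((n, p_vars))
        random_means += np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
    random_means /= n_iter
    return int(np.sum(observed > random_means))

# pfol_items: hypothetical DataFrame with the six PFOL item columns
# check_adequacy(pfol_items)
# print("factors to retain:", parallel_analysis(pfol_items))
```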

Table 1. Item descriptive statistics, exploratory factor analysis parameters, reliability and correlations of the variables studied.

With data from 2,541 participants, confirmatory factor analyses of the PFOL instrument were conducted (IBM SPSS AMOS 19.0) with maximum likelihood estimation, using fit indices such as the chi-square, the Root Mean Square Error of Approximation (RMSEA), the Comparative Fit Index (CFI), the Incremental Fit Index (IFI) and the Akaike Information Criterion (AIC). CFI and IFI values close to 1 show a good statistical fit (Bentler 1990), while the RMSEA indicates a good fit if equal to or less than .08 (Browne and Cudeck 1993). Regarding the AIC, the lower the value, the better the fit. Lastly, the SRMR must be close to zero for a good fit. For all the confirmatory factor analyses, we used maximum likelihood because we were working with a large sample size, which reduces concerns regarding multivariate non-normality (Hair et al. 2010). Furthermore, it is considered a robust estimator with regard to both normally distributed data and violations of normality assumptions (Bollen 1989; Diamantopoulos and Siguaw 2000). Some Monte Carlo experiments have concluded that there are no great differences in results from structural equation modelling using the maximum likelihood estimator in studies with different sample sizes and different kurtosis and skewness levels (Reinartz, Haenlein, and Henseler 2009). The model retrieved from the exploratory factor analysis was tested with confirmatory factor analysis to confirm its initial structure, and it presented good reference values, as indicated in the literature (Hooper, Coughlan, and Mullen 2008) [χ2(6) = 106.43, p = .01, χ2/df = 17.73, CFI = 0.97, IFI = 0.97, RMSEA = .08, LO = .06, HI = .09, AIC = 148.54]. The unstandardised path coefficients were significant at p < .05. From this analysis, the initial structure of the PFOL instrument was maintained.
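A reader wishing to replicate this confirmatory step without AMOS could specify the same one-factor model in lavaan-style syntax. The sketch below uses the Python semopy package as a stand-in, with hypothetical item names pfol1 to pfol6; it illustrates the model specification, not the authors’ exact procedure.

```python
import pandas as pd
from semopy import Model, calc_stats

# One-factor measurement model in lavaan-style syntax; pfol1..pfol6 are
# hypothetical column names for the six PFOL statements.
MODEL_DESC = "PFOL =~ pfol1 + pfol2 + pfol3 + pfol4 + pfol5 + pfol6"

def fit_cfa(data: pd.DataFrame) -> pd.DataFrame:
    """Fit the one-factor CFA and return the fit statistics
    (chi-square, CFI, RMSEA, AIC, ...) as computed by semopy."""
    model = Model(MODEL_DESC)
    model.fit(data)          # maximum-likelihood-type objective by default
    return calc_stats(model)

# stats = fit_cfa(survey_df)  # survey_df: the 2,541 CFA participants' responses
# print(stats.T)
```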

Perceived difficulty with online learning

Perceived Difficulty with Online Learning (PDOL) is an 8-item instrument created for this study that asks participants to mark on a Likert-type scale from 1 (no difficulty) to 5 (a lot of difficulty) the degree of difficulty they felt in online learning in relation to a series of aspects, presented as items in Table 1. An exploratory factor analysis of the PDOL instrument was computed with FACTOR 10.8.02 (Lorenzo-Seva and Ferrando 2013) and IBM SPSS 26.0, using data from 200 participants, to examine the instrument’s structure. Table 1 shows the correlations between all variables and the descriptive statistics. The same procedures applied to the previous instrument were adopted for this one. The analysis suggested a unidimensional factor model for the PDOL with good reliability scores in accordance with the psychometric literature (Nunnally 1978). Moreover, the values of goodness-of-fit (GFI = .96) and residual statistics (RMSEA = .11) were also good as suggested in the literature (McDonald 1999; Nunnally 1978; Velicer 1976).

Confirmatory factor analyses of the PDOL were computed using data from 2,541 participants (IBM SPSS AMOS 19.0) with maximum likelihood estimation and the above-mentioned fit indices. The model retrieved from the EFA was tested with confirmatory factor analysis to confirm its initial structure, and it presented good reference values, in accordance with the literature (Hooper, Coughlan, and Mullen 2008) [χ2(16) = 323.87, p = .01, χ2/df = 20.24, CFI = 0.95, IFI = 0.95, RMSEA = .08, LO = .07, HI = .09, AIC = 379.87]. The standardised path coefficients were significant at p < .05. From this analysis, the initial structure of the PDOL instrument was maintained.

Managing online learning in higher education

Managing Online Learning in Higher Education (MOLHE) is a 7-item instrument created for this study that asks participants to indicate on a Likert-type scale from 1 (totally agree) to 5 (totally disagree) how they managed online learning with respect to a number of aspects, presented as items in Table 1. An exploratory factor analysis of the MOLHE instrument was again computed with FACTOR 10.8.02 (Lorenzo-Seva and Ferrando 2013) and IBM SPSS 26.0, using the data from 200 participants, to examine the instrument’s structure. Table 1 shows the correlations between all variables and the descriptive statistics. The same procedures applied to the previous two instruments were also adopted for this one. The analysis suggested a unidimensional factor model for the MOLHE with good reliability scores in accordance with the psychometric literature (Nunnally 1978). Moreover, the values of goodness-of-fit (GFI = .99) and residual statistics (RMSEA = .06) were also good as suggested in the literature (McDonald 1999; Nunnally 1978; Velicer 1976).

Confirmatory factor analyses of the MOLHE were computed with data from 2,541 participants (IBM SPSS AMOS 19.0) with maximum likelihood estimation and the above-mentioned fit indices. The model retrieved from the EFA was tested to confirm its initial structure, and it presented good reference values, in line with the literature (Hooper, Coughlan, and Mullen 2008) [χ2(12) = 105.77, p = .01, χ2/df = 8.81, CFI = 0.92, IFI = 0.92, RMSEA = .05, LO = .04, HI = .06, AIC = 137.77]. The unstandardised path coefficients were significant at p < .05. From this analysis, the initial structure of the MOLHE instrument was maintained.

Procedures

A link to the survey was created in Qualtrics and sent out to the participants via their respective students’ unions. Students from all subject areas (including Social Sciences, Engineering and Technology, and Health Sciences) and from all cycles of study (undergraduate, masters and PhD) were invited. The questionnaire was administered between 12 June and 12 August 2020. The instruments were delivered online and completed individually by each participant. Each instrument took students approximately 5 to 10 minutes to complete.

The research project was carried out according to the ethical principles of international research in education, namely data confidentiality, informed consent, voluntary participation, and the use of data collected only for research purposes. The project was approved by the Ethics Committee for Research in Social and Human Sciences at the University of Minho (Refª. CEICSH 057/2020). Participants were informed about the goals of the project prior to giving their consent. The link to complete the questionnaire and the research protocol were sent to each of the participants all of whom confirmed their voluntary informed consent to participate in the study.

Data analysis

Data collected through the open-ended questions were analysed using content analysis of emergent categories, based on the semantic criterion, in order ‘to make inferences by systematically and objectively identifying the specific characteristics of a message’ (Esteves 2006, 108). To ensure accuracy of the analysis, ‘verification’ strategies (Creswell 1998) were implemented, namely member checking (Doyle 2007) by the researchers and co-authors in order to triangulate data interpretation. The categories were obtained through an iterative process which included separate coding, comparing results and reaching agreement. Joint discussions and analyses of emerging themes and subthemes were held with the research team so as to ensure relevance, triangulation and trustworthiness of the findings. Due to space constraints, only the most recurrent themes, including quotes from the participants, are presented to complement the quantitative data.

As for the quantitative data, Pearson correlations were computed with IBM SPSS 26 before performing structural equation modelling to investigate the relationship between teacher feedback, students’ management of online learning and their perceived difficulty with distance learning. Statistical significance of the regression coefficients was assessed with IBM SPSS AMOS 26 after estimating the parameters through the maximum likelihood method. The variables were tested for normality by examining univariate and multivariate skewness and kurtosis. χ2 tests were used to assess the statistical significance of effects (Marôco 2010), and effects were considered significant at p < 0.05. Effect sizes (Cohen’s d) are also presented; values of 0.2, 0.5 and 0.8 are considered small, medium and large effect sizes, respectively (Cohen 1988).
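As an illustration of the statistics named in this paragraph, the following sketch computes a Pearson correlation and Cohen’s d from two hypothetical arrays of scale scores; it restates the standard pooled-standard-deviation formula rather than the authors’ analysis scripts.

```python
import numpy as np
from scipy import stats

def cohens_d(x: np.ndarray, y: np.ndarray) -> float:
    """Cohen's d for two independent groups using the pooled standard
    deviation; 0.2 / 0.5 / 0.8 read as small / medium / large (Cohen 1988)."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * np.var(x, ddof=1)
                  + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    return float((np.mean(x) - np.mean(y)) / np.sqrt(pooled_var))

# feedback, difficulty: hypothetical per-student scale scores
# r, p = stats.pearsonr(feedback, difficulty)
# print(f"r = {r:.2f} (p = {p:.3f})")
```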

Results

In this section, findings are presented, beginning with the qualitative data referring to the first and second research questions.

Perceived difficulties

From the qualitative analysis of the students’ responses, difficulties concerning distance education emerged, namely difficulties in concentrating, heavy workload, lack of resources, time management, lack of coping skills, emotional challenges and lack of motivation.

The principal difficulty experienced by students was lack of concentration. The characteristics of online learning itself fostered this lack of concentration: students had to share spaces with other family members in order to attend classes, and were seated for many hours, surrounded by distractions, which consequently reduced the effectiveness of the learning process.

I feel that we managed to overcome the constraints of the situation, although some information is always lost because maintaining concentration in classes via zoom is more difficult.

(S.45)

Distractions are constant (not only from the cell phone, but also the neighbours, construction work in the street, family members/animals coming in and making noise) (…) it is very difficult to maintain concentration during 3-hour classes. (even with a break)

(S.748)

There were also difficulties linked to the heavy workload with implications for students’ well-being and for their academic performance, as they felt that the effort expended to deal with such a workload was not rewarded.

Many teachers when trying to avoid distance tests, ended up choosing to compensate with assignments and multiple mini-tests that left us with a huge workload and, if it is not easy to deal with the situation because we have had no time to internalise what is going on, with such a heavy workload it only makes the situation worse for us students.

(S.104)

… since the university closed, the professors have started to demand a lot in terms of work, not only individually, but also in groups, and I feel that I spend a lot of time doing it. This overload of work and lack of time to ‘switch off’ a little from classes often makes me feel tired, anxious and unmotivated.

(S.434)

Difficulties relating to the lack of resources at different levels were also identified. Students mention lacking the economic means to buy the necessary technological resources, but also point to the lack of support given when they requested these resources. These difficulties were considered obstacles to the quality of their learning.

I lack technological resources and I have a bad family environment. (S.331)

Online learning was something that was unfortunately necessary. However, it was extremely ineffective. Lack of study materials and the lack of minimally decent resources for conducting lessons were the main recurrent problems during distance learning. (S.1704)

Time management was reported as one of the biggest problems that students faced in distance learning. They spoke of the difficulty of managing study time and, being at home, of separating leisure time from work. The excessive workload, and the fact that some of them were working students, also made time management difficult.

The biggest difficulty is managing time and being aware that it is necessary to study since we are at home.

(S.399)

It was a very difficult adaptation process since being at home 24/7, it was much more difficult for me to manage the time I had to carry out certain activities even though I always followed the normal class schedule. I feel that my time was not used so much.

(S.2692)

For me, above all, it has been difficult to manage time. When classes were in person, I felt I could manage time to study or do work and time to relax in a healthy way.

(S.434)

Most of the students were not able to deal with online learning, leading to physical and emotional problems. Students found it difficult to handle the sudden shift to online learning. Their perceptions reveal difficulties in managing and adapting to online learning, especially due to isolation, lack of socialisation and changes of space, among other factors. As a result, some physical and emotional problems emerged. Students reported feelings of tiredness triggered by the number of hours sitting in front of the computer and by levels of anxiety, stress and uncertainty.

My level of anxiety and stress has increased considerably since we’ve been taking online classes, as teachers are demanding much more than they did in face-to-face classes and are scheduling too much work, with it not being easy to manage and reconcile everything from home.

(S.3)

I felt completely powerless faced with the COVID-19 situation and with the University closed, online classes, learning that was of no benefit and the stress associated with being confined in one place, and I ended up reacting badly, despising the work and hindering my academic progress.

(S.77)

According to the participants, lack of motivation was another aspect triggered by the unexpected move to online learning, leading to a lack of interest among students in learning and in attending classes. Some students also pointed out the injustices that arose in this transition process.

I felt a great lack of motivation. Much of my day was spent in front of a computer, attending lessons and doing the many tasks that were asked of us.

(S.513)

It was a semester of change with little adaptation or understanding on the part of the faculty. There was a lot of injustice that led to even greater demotivation.

(S.1013)

The experience of feedback

Students who reported having received feedback during distance learning essentially mention getting positive and constructive feedback aimed at improving their performance and, consequently, their learning. Feedback helped them to overcome difficulties, self-regulate their learning, reflect on their performance and improve their grades. When asked at what moments feedback was given, students indicated that it occurred chiefly after submission of an assignment or task, but also when feedback was requested (including requests for clarification and queries), following summative assessment, while students were working on assignments, reports or theses, and throughout the semester. In other words, students received feedback from teachers both on completed tasks and throughout the process to improve learning.

Help with carrying out assignments, mentioning what we could do better. It also explained at what point we were in our work.

(S.20)

I received feedback throughout the completion of assignments, in order to improve my production as well as improve my final grade.

(S.511)

Every lesson we had to give a short presentation where we explained how our work had progressed during the week and the teacher would immediately give his feedback and immediately say what improvements needed to be made.

(S.1682)

During individual assignments, it was possible to ask the teacher for a first review with the aim of improving work and learning. In addition to this, we also received immediate feedback from group/individual work presentations through the discussion of themes and the information presented, which allowed us to develop the points for the final written assignment.

(S.195)

The qualitative data also show that feedback was given in synchronous sessions, through email and institutional and non-institutional platforms.

I always received feedback through different platforms (Zoom, Skype), which was very important for my development.

(S.235)

Whenever necessary, the teachers get in touch via the institutional email. (S.1306)

Our teachers mentioned, during the classes, what we need to do to improve and by email they also gave their opinions as well as on the blackboard online platform regarding answers to questionnaires and the work carried out. (S.1806)

Some students also showed some dissatisfaction with the lack of support from teachers and their institution. This lack of support led to greater difficulties in learning, particularly the lack of monitoring of the process as well as excessive tiredness and lack of motivation.

The fact that I reacted badly to the closure of the university is based largely on the lack of support I felt from teachers, which often left me stressed and extremely tired …

(S.1082)

Because although I continue to carry out the work, I felt a bit abandoned by the University at the beginning (in relation to my specific course for which the rules were not clearly defined) … I positively emphasise the intervention of my supervisor for his care in helping us to finish the academic year in the best way possible.

(S.205)

In certain subjects, it was difficult to obtain feedback from teachers; they often did not have well-defined guidelines and it was difficult to clarify doubts.

(S.2536)

As we did practical exercises the teachers gave feedback. However, not all did. Sometimes we did tasks for which we never got any correction.

(S.2115)

Teachers often did not send us feedback/clarification of doubts in good time, which resulted in a feeling of insecurity about the work I was doing, not knowing if I was doing what was asked and also not knowing how I was doing regarding my ongoing assessment.

(S.513)

Quantitative data

Quantitative data, descriptive statistics and Pearson correlations between the variables concerning the research questions are presented in Table 1. The highest means for the difficulties felt by the students related to difficulties in concentrating and in time and task management. Lack of support on the part of teachers and of the institution also had average values above 2.5. The means obtained for students’ perceptions of having received feedback were also high, as were those for items associated with being comfortable with online teaching and learning and with having the necessary resources to adapt well to the new environment.

Concerning the specific hypotheses of the third research question, quantitative data are presented. To test the relationship between university students’ perceived feedback and their perceived difficulty through their management of online learning, structural equation modelling was used (Figure 1). The adjusted model (χ2[179] = 2117.43, p < .01, χ2/df = 11.82, CFI = 0.93, GFI = 0.92, IFI = 0.93, RMSEA = .06, LO = .06, HI = .07, SRMR = .05, AIC = 2221.43) explained 50% of the variance in perceived difficulty. All paths were statistically significant, revealing that students’ perceived feedback had a direct effect on the difficulties they felt (β = −.05; LO = −.09; HI = −.01; Cohen’s d = .93) and on how they managed online learning (β = .45; LO = .39; HI = .46; Cohen’s d = 2.00), thus confirming hypotheses 1 and 2, respectively. In fact, students’ perceived feedback had a negative direct effect on the difficulties they reported, whereas it had a positive direct effect on how they managed online learning. Moreover, the model also shows a statistically significant negative indirect effect of students’ perceived feedback on their difficulties, through the way they managed online learning (β = −.29; LO = −.32; HI = −.26). The indirect effect was stronger than the direct effect, revealing the importance of investigating how students managed online learning, which explained the relationship between perceived feedback and difficulties.
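These estimates follow the usual product-of-coefficients decomposition of a mediation model. The sketch below restates it using the reported values; note that the management-to-difficulty coefficient itself is not reported in the text, so the final figure is inferred, not taken from the model output.

```latex
% F = perceived feedback, M = management of online learning, D = perceived difficulty
\[
\beta_{\text{total}} \;=\; \underbrace{\beta_{F \to D}}_{-.05\ (\text{direct})}
 \;+\; \underbrace{\beta_{F \to M}\,\beta_{M \to D}}_{-.29\ (\text{indirect})}
 \;\approx\; -.34,
\qquad
\beta_{M \to D} \;\approx\; \frac{-.29}{\beta_{F \to M}} \;=\; \frac{-.29}{.45} \;\approx\; -.64 .
\]
```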

Figure 1. The relationship between perceived feedback and difficulties through higher education students’ management of online learning.


Discussion and conclusion

The study points to clear connections between university students’ perceptions about feedback, their perceived difficulties, and their capacity to manage online learning during the first lockdown in 2020. Results confirmed the first and second hypotheses of the third research question: the feedback university students received from their teachers had a direct effect both on the difficulties they felt in online learning and on their management of online learning.

Specifically, students’ perceived feedback had a negative direct effect on the difficulties they reported, whereas it had a positive direct effect on how they managed online learning.

Findings from this study indicate that higher education students who perceived that they had received more feedback reported less difficulty in online learning during lockdown. Furthermore, students’ perceived feedback predicts their perceived difficulty in online teaching and learning, although students reported more perceived difficulty than perceived feedback overall. These findings suggest that if students are provided with feedback, self-regulation of learning may be enhanced and, as a result, adaptation to online teaching and learning may improve. Previous literature suggests that feedback influences students’ future work and their learning strategies (Winstone and Boud) and that online students tend to use self-regulated learning strategies more often than blended learning students (Broadbent 2017). These results may also be linked to the trend of assessment feedback being increasingly provided online (Evans 2013), meaning that students are more adapted to this process.

Qualitative data corroborate and explain quantitative data as they also show that the feedback received during lockdown was perceived as positive and constructive, enabling self-regulation of learning as well as the improvement of students’ performance throughout the process, as it encouraged the correction of detected flaws. This feedback, in addition to being given in synchronous sessions, was also provided by teachers via email and through different online platforms.

This study also shows that feedback pertaining to the learning process was a strong indicator of students’ perceived feedback in online teaching and learning during lockdown. Students react positively to feedback and to issues of accessibility and emotionality, which have an impact on their future learning, as well as to aspects of communication based on the kind of language they are used to (Moffitt, Padgett, and Grieve 2020). Additionally, instructor availability in online instructor-student interactions is associated with greater student learning satisfaction and higher levels of engagement with content (Richardson and Swan 2003). Yet other studies suggest that students consider online feedback less personal than offline paper-based handwritten feedback (Parkin et al. 2012).

This study also shows that the feedback university students received from their teachers had an indirect effect on the difficulties they felt, through their management of online learning. It should be noted that this effect is negative, which highlights the importance of studying how students managed online learning, in particular the connections between the difficulties they experienced and their feedback experiences.

The difficulty pertaining to managing time was also a strong indicator of students’ perceived difficulty in online teaching and learning during the first lockdown. Students who adapted negatively to the closure of their university also point to the lack of support from their teachers and their institution, as well as to a lack of adequate equipment and to difficulties in concentrating, managing time, responding to teachers’ assignments and complying with all the tasks required of them, including following online teaching (see Flores, Barros, et al. 2021). Students’ responses to the open questions corroborate the quantitative data: perceived difficulties with online teaching and learning related mainly to lack of concentration, lack of support from teachers and the institution, heavy workload, time management and lack of resources.

These findings are also consistent with earlier research which indicates that students have concerns regarding time management in online learning (Rasiah, Kaur, and Guptan 2020), often attributed to the number of assessment tasks, which can result in lower quality work on the part of teachers as well as lower quality feedback on students’ learning (Gusso et al. 2020). However, other empirical studies report that students manage study time effectively and do not face any difficulty in online learning (Adnan and Anwar 2020). Hast (2021) also highlights the importance of students having appropriate technological access, as well as the motivation to access their online feedback. The study by Shetty et al. (2020) suggests that students experience many challenges during online learning, such as lack of face-to-face interaction, lack of socialisation, distraction by social media, and technology-related issues. Moreover, existing literature indicates that motivated students are more likely to engage in challenging activities, become actively involved in learning activities, perform better and be more determined to succeed in the face of challenging circumstances (Armstrong-Mensah et al. 2020).

Findings from this study also suggest that students who adapted well to online teaching and learning had higher means on items related to their perceptions of teachers giving them feedback during the process of doing a task, or after its completion, and of teachers being available to provide feedback when students asked for it. Qualitative data also demonstrate that a lack of feedback from teachers made it difficult to manage the transition to online teaching and learning. Online feedback entails some challenges but also opportunities, for example the availability of digital platforms (Dunlap et al. 2016). It is important that faculty provide students with timely feedback, including online video tutoring and email guidance after class (Bao 2020).

Overall, this study adds to the existing literature by providing a discussion of aspects related to the feedback process in higher education, specifically in online teaching. The COVID-19 pandemic and the reconfiguration of university educational provision during the two lockdowns have highlighted key issues such as effective online teaching, where the role of effective feedback practices is of particular importance. Feedback delivered online, like feedback given face-to-face, should be effective, formative and provided clearly (Hodge and Chenelle 2018). In addition, university teachers should take advantage of the variety of feedback mechanisms available in online environments, ‘including peer and instructor feedback, as well as self-reflection’ (Hodge and Chenelle 2018, 195), by making use of the greater diversity of feedback practices described in the literature (e.g. Bao 2020; Hast 2021; Ryan and Henderson 2018).

The analysis resulting from the qualitative data shows how students experienced online education, in particular the difficulties encountered and how feedback was useful in managing them. The overall findings point to the need to develop feedback strategies aimed at addressing the difficulties experienced and the way students manage online learning. In addition to contextual and institutional factors, future research should take into account students’ personal characteristics and the impact they have on how students manage the entire process. This study suggests that it is important to devote time and resources to timely and adequate feedback in higher education. This encompasses, amongst other features, the provision of professional development opportunities for staff, focusing not only on the conceptual, methodological and ethical dimensions of feedback but also on shared practices and tools, with a particular focus on the learning arising from the pandemic experience. It is also important to provide students with opportunities to learn from feedback and to expand their abilities to provide feedback to their peers.

Feedback is a complex process influenced by the ecology of practices, individual factors and contextual constraints (Henderson, Ryan et al. 2019). As such, it should be explored more than ever, since the pandemic has entailed constraints, limitations and impositions that will influence both the learning process and the assessment process, with implications for the ways in which feedback is put into practice. In particular, it would be important to explore further the kinds of techniques and formats of feedback being used in the post-COVID period, as hybrid environments for teaching and learning in higher education become more and more prevalent.

Ethical statement

The research project was carried out according to the ethical principles of international educational research, namely data confidentiality, informed consent, voluntary participation, and the use of data collected only for research purposes.

Acknowledgments

This work was financially supported by Portuguese national funds through the FCT (Foundation for Science and Technology) within the framework of the CIEC (Research Centre on Child Studies of the University of Minho) projects under the references UIDB/00317/2020 and UIDP/00317/2020.

Disclosure statement

No potential conflict of interest was reported by the author(s).

References

  • Adnan, M., and K. Anwar. 2020. “Online Learning Amid the COVID-19 Pandemic: Students’ Perspectives.” Journal of Pedagogical Sociology & Psychology 2 (1): 45–51. https://doi.org/10.33902/JPSP.2020261309.
  • Armstrong-Mensah, E., K. Ramsey-White, B. Yankey, and S. Self-Brown. 2020. “COVID-19 and Distance Learning: Effects on Georgia State University School of Public Health Students.” Frontiers in Public Health 8 (576227). https://doi.org/10.3389/fpubh.2020.576227.
  • Bandalos, D. L., and S. J. Finney. 2010. “Factor Analysis: Exploratory and Confirmatory.” In The reviewer’s Guide to Quantitative Methods in the Social Sciences, edited by G. Hancock and R. Muller, 98–122. London: Routledge.
  • Bao, W. 2020. “COVID 19 and Online Teaching in Higher Education. A Case Study of Peking University.” Human Behavior and Emerging Technologies 2:113–115. https://doi.org/10.1002/hbe2.191.
  • Bentler, P. M. 1990. “Comparative Fit Indexes in Structural Models.” Psychological Bulletin 107 (2): 238–246. https://doi.org/10.1037/0033-2909.107.2.238.
  • Bollen, K. A. 1989. Structural Equations with Latent Variables. New Jersey: John Wiley & Sons. https://doi.org/10.1002/9781118619179.
  • Bollen, K. A., and J. S. Long. 1993. Testing Structural Equations Models. Newbury Park, CA: Sage.
  • Bridge, P., and R. Appleyard. 2005. “System Failure: A Comparison of Electronic and Paper-Based Assignment Submission, Marking, and Feedback.” British Journal of Educational Technology 36 (4): 669–671. https://doi.org/10.1111/j.1467-8535.2005.00485.x.
  • Broadbent, J. 2017. “Comparing Online and Blended Learner’s Self-Regulated Learning Strategies and Academic Performance.” The Internet and Higher Education 33:24–32. https://doi.org/10.1016/j.iheduc.2017.01.004.
  • Browne, M. W., and R. Cudeck. 1993. “Alternative Ways of Assessing Model Fit.” In Testing Structural Equation Models, edited by K. A. Bollen and J. S. Long, 136–162. Beverly Hills, CA: Sage.
  • Cohen, J. 1988. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale, NJ: Erlbaum.
  • Creswell, J. W. 1998. Qualitative Inquiry and Research design: Choosing Among Five Traditions. LA: Sage Publications, Inc.
  • Davies, J., and M. Graff. 2005. “Performance in E-Learning: Online Participation and Student Grades.” British Journal of Educational Technology 36 (4): 657–663. https://doi.org/10.1111/j.1467-8535.2005.00542.x.
  • Diamantopoulos, A., and J. A. Siguaw. 2000. Introducing LISREL: A Guide for the Uninitiated. CA: Sage Publications.
  • Doyle, S. 2007. “Member Checking with Older Women: A Framework for Negotiating Meaning.” Health Care for Women International 28 (10): 888–908. https://doi.org/10.1080/07399330701615325.
  • Dunlap, J., D. Bose, P. Lowenthal, C. York, M. Atkinson, and J. Murtagh. 2016. “What Sunshine is to Flowers: A Literature Review on the Use of Emoticons to Support Online Learning.” In Emotions, Technology, Design, and Learning, edited by S. Y. Tettegah and M. Gartemeier, 163–182. San Diego: Elsevier. https://doi.org/10.1016/B978-0-12-801856-9.00008-6.
  • Esteves, M. 2006. “Análise de Conteúdo.” In Fazer Investigação. Contributos para a elaboração de dissertação e teses, edited by L. Lima and J. A. Pacheco, 105–126. Porto: Porto Editora.
  • Evans, C. 2013. “Making Sense of Assessment Feedback in Higher Education.” Review of Educational Research 83 (1): 70–120. https://doi.org/10.3102/0034654312474350
  • Flores, M. A., A. Barros, A. M. V. Simão, D. Pereira, P. Flores, E. Fernandes, L. Costa, and P. C. Ferreira. 2021. “Portuguese Higher Education students’ Adaptation to Online Teaching and Learning in Times of the COVID-19 Pandemic: Personal and Contextual Factors.” Higher Education 3 (6): 1–20. https://doi.org/10.1007/s10734-021-00748-x.
  • Flores, M. A., and M. Gago. 2020. “Teacher Education in Times of COVID-19 Pandemic in Portugal: National, Institutional and Pedagogical Responses.” Journal of Education for Teaching 46 (4): 507–516. https://doi.org/10.1080/02607476.2020.1799709.
  • Flores, M. A., A. M. Veiga Simão, A. Barros, P. Flores, D. Pereira, E. L. Fernandes, P. Costa Ferreira, and L. Costa. 2021. “Ensino e aprendizagem à distância em tempos de COVID-19: um estudo com alunos do Ensino Superior.” Revista Portuguesa de Pedagogia 55:55. https://doi.org/10.14195/1647-8614_55_1.
  • Fuller, R., V. Joynes, J. Cooper, K. Boursicot, and T. Roberts. 2020. “Could COVID-19 Be Our ‘There is No Alternative’ (TINA) Opportunity to Enhance Assessment?” Medical Teacher 42 (7): 781–786. https://doi.org/10.1080/0142159X.2020.1779206.
  • Gonzalez, T., M. A. de la Rubia, K. P. Hincz, M. Comas-Lopez, L. Subirats, S. Fort, and G. M. Sacha. 2020. “Influence of COVID-19 Confinement on Students’ Performance in Higher Education.” PLoS One 15 (10): e0239490. https://doi.org/10.1371/journal.pone.0239490.
  • Grieve, R., C. R. Padgett, and R. L. Moffitt. 2016. “Assignments 2.0: The Role of Social Presence and Computer Attitudes in Student Preferences for Online versus Offline Marking.” The Internet and Higher Education 28:8–16. https://doi.org/10.1016/j.iheduc.2015.08.002.
  • Guangul, F. M., A. H. Suhail, M. I. Khalit, and B. A. Khidhir. 2020. “Challenges of Remote Assessment in Higher Education in the Context of COVID-19: A Case Study of Middle East College.” Educational Assessment, Evaluation and Accountability 8 (4): 519–535. https://doi.org/10.1007/s11092-020-09340-w.
  • Gusso, H., A. Archer, F. Luiz, F. Sahão, G. Luca, M. H. O. Henklain, M. G. Panosso, N. Kiener, O. Beltramello, and V. M. Gonçalves. 2020. “Ensino Superior Em Tempos De Pandemia: Diretrizes à Gestão Universitária.” Educação & Sociedade 41:e238957. https://doi.org/10.1590/es.238957.
  • Hair, J. F., W. C. Black, B. J. Babin, and R. E. Anderson. 2010. Multivariate Data Analysis: A Global Perspective. Upper Saddle River: Pearson Education Inc.
  • Harrison, C. J., K. D. Könings, A. Molyneux, L. W. Schuwirth, V. Wass, and C. P. van der Vleuten. 2013. “Web-Based Feedback After Summative Assessment: How Do Students Engage?” Medical Education 47 (7): 734–744. https://doi.org/10.1111/medu.12209.
  • Hast, M. 2021. “Higher Education in Times of Covid-19: Giving Online Feedback Implementation Another Look.” Higher Education Studies 11 (1): 1. https://doi.org/10.5539/hes.v11n1p1.
  • Hast, M., and C. Healy. 2018. ““It’s Like Fifty-fifty”: Using the Student Voice Towards Enhancing Undergraduates’ Engagement with Online Feedback Provision.” Journal of Teaching and Learning with Technology 7 (1): 139–151. https://doi.org/10.14434/jotlt.v7i1.23806.
  • Hattie, J., and H. Timperley. 2007. “The Power of Feedback.” Review of Educational Research 77 (1): 81–112. https://doi.org/10.3102/003465430298487.
  • Hatziapostolou, T., and I. Paraskakis. 2010. “Enhancing the Impact of Formative Feedback on Student Learning Through an Online Feedback System.” Electronic Journal of E-Learning 8 (2): 111–122.
  • Henderson, M., E. Molloy, R. Ajjawi, and D. Boud. 2019. “Designing Feedback for Impact.” In The Impact of Feedback in Higher Education, edited by M. Henderson, R. Ajjawi, D. Boud, and E. Molloy, 267–285. London: Palgrave Macmillan.
  • Henderson, M., T. Ryan, and M. Phillips. 2019. “The Challenges of Feedback in Higher Education.” Assessment & Evaluation in Higher Education 44 (8): 1237–1252. https://doi.org/10.1080/02602938.2019.1599815.
  • Hodge, E., and S. Chenelle. 2018. “The Challenge of Providing High-Quality Feedback Online: Building a Culture of Continuous Improvement in an Online Course for Adult Learners.” Transformations: The Journal of Inclusive Scholarship and Pedagogy 28 (2): 195–201. https://doi.org/10.1353/tnf.2018.0013.
  • Hooper, D., J. Coughlan, and M. Mullen. 2008. “Structural Equation Modelling: Guidelines for Determining Model Fit.” Electronic Journal of Business Research Methods 6 (1): 53–60. https://doi.org/10.21427/D7CF7R.
  • Joreskog, K. G. 1977. “Factor Analysis by Least-Squares and Maximum-Likelihood Methods.” In Statistical Methods for Digital Computers, edited by K.A. Enslein, 125–153.
  • Lim, M. 2020. “Educating Despite the COVID-19 Outbreak: Lessons from Singapore.” The World University Rankings. https://www.timeshighereducation.com/blog/educating-despite-covid-19-outbreaklessons-singapore#%20.
  • Lorenzo-Seva, U., and P. J. Ferrando. 2013. “Factor 9.2: A Comprehensive Program for Fitting Exploratory and Semiconfirmatory Factor Analysis and IRT Models.” Applied Psychological Measurement 37 (6): 497–498. https://doi.org/10.1177/0146621613487794.
  • Marôco, J. 2010. Structural Equation Analyses: Theoretical Foundations, Software and Applications. Pêro Pinheiro: Report Number Lda. ISBN: 978-989-96763-1-2.
  • McDonald, R. P. 1999. Test Theory: A Unified Treatment. Mahwah, NJ: LEA.
  • MCTES. 2020. Nota de Esclarecimento do Gabinete do Ministro da Ciência, Tecnologia e Ensino Superior de 13 março. Lisboa: Ministério da Ciência, Tecnologia e Ensino Superior. https://www.sec-geral.mec.pt/noticia/nota-de-esclarecimento-do-gabinete-do-ministro-da-ciencia-tecnologia-e-ensino-superior.
  • Moffitt, R., C. Padgett, and R. Grieve. 2020. “Accessibility and Emotionality of Online Assessment feedback: Using Emoticons to Enhance Student Perceptions of Marker Competence and Warmth.” Computers & Education 143:1–11. https://doi.org/10.1016/j.compedu.2019.103654.
  • Nunnally, J. C. 1978. Psychometric Theory. 2nd ed. New York: McGraw-Hill.
  • OECD. 2020. Education Responses to COVID-19: Embracing Digital Learning and Online Collaboration. https://read.oecd-ilibrary.org/view/?ref=120_120544-8ksud7oaj2&title=Education_responses_to_Covid19_Embracing_digital_learning_and_online_collaboration.
  • Orsmond, P., S. Merry, and K. Reiling. 2005. “Biology students’ Utilization of tutors’ Formative Feedback: A Qualitative Interview Study.” Assessment & Evaluation in Higher Education 30 (4): 369–386. https://doi.org/10.1080/02602930500099177.
  • Parkin, H. J., S. Hepplestone, G. Holden, B. Irwin, and L. Thorpe. 2012. “A Role for Technology in Enhancing students’ Engagement with Feedback.” Assessment & Evaluation in Higher Education 37 (8): 963–973. https://doi.org/10.1080/02602938.2011.592934.
  • Quezada, R., C. Talbot, and K. Quezada-Parker. 2020. “From Bricks and Mortar to Remote Teaching: A Teacher Education Program’s Response to COVID-19.” Journal of Education for Teaching 46 (4): 472–483. https://doi.org/10.1080/02607476.2020.1801330.
  • Ramsden, P. 2003. Learning to Teach in Higher Education. London: Routledge Falmer.
  • Rasiah, R., H. Kaur, and V. Guptan. 2020. “Business Continuity Plan in the Higher Education Industry: University Students’ Perceptions of the Effectiveness of Academic Continuity Plans During COVID-19 Pandemic.” Applied System Innovation 3 (4): 51. https://doi.org/10.3390/asi3040051.
  • Reinartz, W., M. Haenlein, and J. Henseler. 2009. “An Empirical Comparison of the Efficacy of Covariance-Based and Variance-Based SEM.” International Journal of Research in Marketing 26 (4): 332–344. https://doi.org/10.1016/j.ijresmar.2009.08.001.
  • Richardson, J., and K. Swan. 2003. “Examining Social Presence in Online Courses in Relation to Students’ Perceived Learning and Satisfaction.” Online Learning 7:68–81. https://doi.org/10.24059/olj.v7i1.1864.
  • Ryan, T., and H. Henderson. 2018. “Feeling Feedback: Students’ Emotional Responses to Educator Feedback.” Assessment & Evaluation in Higher Education 43 (6): 880–892. https://doi.org/10.1080/02602938.2017.1416456.
  • Ryan, T., M. Henderson, and M. Phillips. 2019. “Feedback Modes Matter: Comparing Student Perceptions of Digital and Non‐Digital Feedback Modes in Higher Education.” British Journal of Educational Technology 50 (3): 1507–1523. https://doi.org/10.1111/bjet.12749.
  • Sendziuk, P. 2010. “Sink or Swim? Improving Students’ Learning Through Feedback and Self-Assessment.” International Journal of Teaching and Learning in Higher Education 22 (3): 320–330. https://www.isetl.org/ijtlhe/pdf/IJTL/HE800.pdf.
  • Shetty, S., C. Shilpa, D. Dey, and S. Kavya. 2020. “Academic Crisis During COVID 19: Online Classes, a Panacea for Imminent Doctors.” Indian Journal of Otolaryngology and Head and Neck Surgery 74 (1): 45–49. https://doi.org/10.1007/s12070-020-02224-x.
  • Son, C., S. Hegde, A. Smith, X. Wang, and F. Sasangohar. 2020. “Effects of COVID-19 on College students’ Mental Health in the United States: Interview Survey Study.” Journal of Medical Internet Research 22 (9): e21279. https://doi.org/10.2196/21279.
  • Velicer, W. F. 1976. “Determining the Number of Components from the Matrix of Partial Correlations.” Psychometrika 41 (3): 321–327. https://doi.org/10.1007/BF02293557.
  • Xavier, B., A. P. Camarneiro, L. Loureiro, E. Menino, A. Cunha-Oliveira, and A. P. Monteiro. 2020. “Impacto da COVID-19 nas dinâmicas sociofamiliares e académicas dos estudantes de enfermagem em Portugal.” Revista de Enfermagem Referência 4: 1–9. https://doi.org/10.12707/RV20104.