
Cross-cultural assessment of the community of inquiry instrument: a comparison between UK and US students

Received 02 Oct 2022, Accepted 03 Jan 2024, Published online: 15 Jan 2024

ABSTRACT

The purpose of this paper is to conduct a cross-cultural assessment of the Community of Inquiry (CoI) instrument [Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the community of inquiry framework using a multi-institutional sample. The Internet and Higher Education, 11(3-4), 133–136], comparing its application between students in the UK and the US. Using data collected from 245 UK accounting students, exploratory factor analysis was performed to assess the factorial structure of the CoI instrument. The analysis identifies three factors, consistent with the three presences outlined in the CoI framework [Garrison, D. R., & Archer, W. (2000). A transactional perspective on teaching and learning: A framework for adult and higher education. Elsevier Science]. However, five items display inadequate factor loadings below 0.5. In a cross-cultural comparison with the US study of Arbaugh et al. (2008), distinct differences emerge. Notably, UK students exhibit higher mean scores for teaching, social, and cognitive presences compared to their US counterparts. In conclusion, it appears that cultures with high long-term orientation and low uncertainty avoidance could be more conducive to active engagement in the learning process and critical thinking, thus leading to high student perceptions of CoI.

Introduction

This study aims to answer the research question: how does the application of the Community of Inquiry (CoI) framework differ between accounting students in the UK and the US in the context of blended learning environments? The CoI framework was introduced in the US to understand key elements of online learning (Garrison & Archer, 2000). The framework is based on the concept of ‘community of inquiry’, first introduced by Lipman (1991) as an ideal educational context that promotes critical thinking and deep learning. The CoI framework holds that online learning is a process that occurs within a community of inquiry (Lipman, 1991) and comprises three key elements: teaching, social, and cognitive presences. Teaching presence refers to the design, facilitation, and direction of the cognitive learning process (Anderson et al., 2001; Garrison & Arbaugh, 2007), social presence relates to students’ ability to project their characteristics into the community (Garrison & Archer, 2000; Swan & Ice, 2010), and cognitive presence refers to the extent to which participants can construct meaning through sustained communication (Garrison & Archer, 2000). A deep and meaningful educational experience occurs through the interplay of teaching, social, and cognitive presences within the community of inquiry (Garrison & Archer, 2000; Kozan & Richardson, 2014). The CoI framework has become a valuable tool in online learning, guiding educators to understand the key elements of online learning environments (Castellanos-Reyes, 2020; Fiock, 2020; Stenbom, 2018).

This study is important in the context of the educational transformations caused by the COVID-19 pandemic (Reyneke et al., 2021; Sangster et al., 2020). Inspired by the research agenda outlined by Sangster et al. (2020), which calls for empirical studies into evolving teaching practices post-COVID-19, this study focuses on ‘building learning communities among students in blended and online environments’ and examining ‘students’ preferences for, and satisfaction with, different forms of blended and online learning’ (Sangster et al., 2020, p. 445). Indeed, the pandemic has resulted in a significant transition in the landscape of accounting education in the higher education sector (Reyneke et al., 2021; Sangster et al., 2020). Before the pandemic, accounting education was primarily delivered by traditional in-person lectures and tutorials, while post-pandemic accounting education will be characterised by a blended approach that combines in-person and online delivery methods using tools like Moodle, Teams, and Zoom (Malan, 2020; Russo et al., 2022). This transition has created a research gap in examining online learning communities, collaborative inquiry, and the potential for developing new pedagogical approaches in blended and online learning environments. Addressing the research gap is essential for enhancing pedagogical strategies in accounting education. It also holds implications for international accounting educators, particularly in the context of cultural factors that may impact learning.

In answering the research question, this study conducts a cross-cultural assessment of the CoI instrument (Arbaugh et al., 2008; see Note 1), comparing its application between students in the UK and the US. Data was collected from 245 accounting students at a UK university using Arbaugh et al.’s (2008) CoI instrument. The study tests the factorial structure of the CoI instrument in the UK setting via exploratory factor analysis and engages in a cross-cultural comparison by contrasting the findings of the present study in the UK with those of Arbaugh et al. (2008) in the US.

Contribution

The present study makes three contributions to the field of accounting education and blended learning. First, the study contributes to the blended learning literature by identifying concerns about the factorial structure of the CoI instrument in the UK accounting education context. The study shows that the CoI instrument does not factor out as cleanly as it did in the original US context, with five items in the CoI instrument showing low factor loadings below the conventional 0.5 threshold (Hair et al., 2010), thus bringing their applicability into question. The study provides evidence to support Stenbom’s (2018) call for revising Arbaugh et al.’s (2008) CoI instrument to enhance its psychometric properties. Furthermore, the study points to potential ambiguities in item phrasing and a likelihood of measuring constructs other than those initially intended, highlighting the need to consider cultural and contextual factors in the wider application of the CoI instrument.

Second, the study contributes to the literature by engaging in a comparative analysis of UK and US student perceptions of CoI by contrasting the results of the present study with those of Arbaugh et al. (2008) in the US. This comparison provides insights into the differences between the two countries concerning student perceptions of CoI. The comparison findings suggest that UK students exhibit higher mean scores for teaching, social, and cognitive presences than US students. To better understand these differences, the study employs Hofstede’s (2001) cultural dimensions, specifically long-term orientation and uncertainty avoidance, to provide an explanation. This cross-cultural comparison contributes to the literature by advancing our understanding of how cultural context could impact student perceptions of CoI.

Third, this study adds empirical evidence to the CoI literature by testing Garrison and Archer’s (2000) CoI framework in the UK setting, where limited research has been conducted to examine student perceptions of CoI (Stenbom, 2018). The study responds to Sangster et al.’s (2020) call for empirical research on building learning communities in blended and online environments, especially relevant in light of the recent changes brought about by the COVID-19 pandemic in accounting education (Malan, 2020; Reyneke et al., 2021; Russo et al., 2022).

The remainder of the paper is structured as follows. Section two reviews the literature on the CoI framework. Section three details the research methods used in the study. Research findings are presented in Section four. The final section discusses the implications of research findings, instructional recommendations, and future research directions.

Literature review

The community of inquiry (CoI) framework

The CoI framework, as introduced by Garrison and Archer (2000), is a model for conceptualising the key elements of an online learning experience – see Figure 1. The framework is grounded in the belief that collaborative and dialogic learning communities are essential for promoting deep and meaningful learning (Garrison & Archer, 2000). The framework comprises three key elements: teaching presence, social presence, and cognitive presence.

Figure 1. Garrison and Archer’s (2000) community of inquiry framework.

The first element ‘teaching presence’ is defined as the design, facilitation, and direction of the cognitive learning process (Garrison & Arbaugh, 2007; Garrison & Archer, 2000). The design aspect includes the selection, organisation, and presentation of course content (Garrison & Arbaugh, 2007). Facilitation, which is the responsibility of the instructor but can also be shared among some learners (Garrison & Archer, 2000), involves guiding the educational process. Furthermore, Anderson et al. (2001) classify teaching presence into three subcategories: (i) instructional design and organisation, such as setting curriculum and designing methods; (ii) facilitating discourse, including setting the course climate and acknowledging or reinforcing student contributions; and (iii) direct instruction, which includes summarising the discussion and presenting content.

Previous research on teaching presence has examined the impact of teaching presence on student perceptions of learning and satisfaction (Arbaugh, 2007; Caskurlu et al., 2020; Khalid & Quick, 2016; Shea et al., 2005). For example, Shea et al. (2005) reported a strong, positive correlation between teaching presence and students’ perceived learning. Caskurlu et al. (2020) found moderately strong positive relationships between teaching presence and perceived learning, as well as between teaching presence and satisfaction. However, not all studies have yielded similar results. For example, Zhan et al. (2007) examined the relationship between teaching presence and student learning outcomes and found no correlation between student-perceived teaching presence and satisfaction with the learning experience. Despite the mixed results, the literature collectively supports that teaching presence is essential for online learning and can significantly impact students’ learning experiences.

The second element ‘social presence’ is students’ ability to present themselves socially and emotionally as ‘real people’ (Garrison & Archer, 2000, p. 94). Social presence is essential in supporting cognitive presence and promoting critical thinking among learners (Garrison & Archer, 2000). Swan and Ice (2010) explain that social presence refers to students’ ability to feel emotionally connected with peers and exhibit their personality through communication and interactions. Richardson et al. (2017) identify 15 different definitions of social presence, all of which share the common theme of perceiving others in an online environment. However, concern has been raised that, when discussing social presence, it can be challenging to distinguish between social presence, social interaction, intimacy, emotion, and connectedness (Lowenthal, 2010).

Previous studies on social presence have shown that students who experience high levels of social presence tend to report high levels of perceived learning and instructor satisfaction (Oh et al., 2018; Richardson & Swan, 2003; Swan & Shih, 2005). For example, Richardson and Swan (2003) examined the relationship between social presence in online learning environments and student perceptions of learning and instructor satisfaction. They found that students who perceived high levels of social presence reported high levels of perceived learning and instructor satisfaction. Swan and Shih (2005) compared students with different levels of social presence and found that the perceived presence of instructors had a more significant impact on student satisfaction than the presence of peers. Oh et al. (2018) reviewed 152 studies and found evidence for the positive impact of immersion and context on social presence. They observed a ceiling effect: social presence increased between low and medium levels of interactivity but plateaued between medium and high levels. They recommended that future research take a holistic approach to social presence by considering boundary conditions like contextual and individual factors, as well as the dynamics between the conversation partners.

The third element ‘cognitive presence’ refers to students’ ability to construct meaning through sustained communication (Garrison & Archer, 2000). Garrison and Arbaugh (2007) conceptualise cognitive presence as a cycle of practical inquiry, comprising four stages: (i) a triggering event, where a problem is identified for further inquiry; (ii) exploration, where students explore the problem; (iii) integration, where students derive meaning from ideas; and (iv) resolution, where students apply new skills and knowledge. Swan and Ice (2010) describe cognitive presence as the degree to which students can build knowledge through reflection and discourse.

Previous research on cognitive presence has emphasised the impact of cognitive presence on critical thinking and academic success. Garrison and Archer (2000) suggest that cognitive presence is an essential component of critical thinking, which is the ultimate goal of higher education (Puig et al., 2019). Furthermore, cognitive presence is considered a fundamental requirement for academic success (Garrison & Archer, 2000). Empirically, Akyol and Garrison (2011) examined the creation of cognitive presence in blended learning communities and found a strong relationship between student perceptions of cognitive presence and their learning outcomes, as measured by their final grades. However, they acknowledged that results associated with the measures of cognitive presence are influenced by the design of the course, as well as the learning characteristics and experiences of the students.

Teaching, social, and cognitive presences intersect and interact, forming the foundation of a deep, meaningful learning experience (Garrison & Archer, 2000). Teaching presence, which includes the design, facilitation, and direction of cognitive and social processes, serves as a scaffold for maintaining the learning experience (Anderson et al., 2001). This orchestration of the learning experience is interwoven with social presence, recognised as the ability of learners to present themselves as ‘real people’ in a virtual environment (Garrison & Archer, 2000). The intersection of teaching and social presences creates a supportive, engaging environment that fosters the third presence, cognitive presence – the extent to which learners can construct meaning through sustained communication (Garrison et al., 2001).

Regarding the impact of the three presences on learning, the CoI framework suggests that the three presences work synergistically to enhance both individual and communal learning outcomes. Specifically, teaching presence plays an essential role in orchestrating the learning community, guiding individual learners throughout their learning journey, and helping them construct meaning from the course content (Garrison & Archer, 2000). Social presence, in turn, fosters a sense of community, mutual trust, and respect among learners. This environment boosts individual learner engagement and participation in the learning process (Richardson et al., 2017; Richardson & Swan, 2003), which is pivotal for a thriving learning community (Garrison & Archer, 2000). Cognitive presence facilitates individual learning, enabling learners to construct and validate understanding through reflection, dialogue, and interaction within the learning community. This translates into a deep and meaningful individual learning experience where critical thinking is refined and new knowledge is acquired (Akyol & Garrison, 2011; Garrison et al., 2001). Therefore, the interaction of the three presences is essential for optimising both communal and individual learning experiences (Garrison & Archer, 2000; Shea et al., 2005).

The CoI framework has been widely applied in research on online learning (Castellanos-Reyes, 2020; Jan et al., 2019; Richardson et al., 2017; Stenbom, 2018; Swan & Ice, 2010). Many studies have provided evidence in support of the three presences (Arbaugh et al., 2008; Díaz et al., 2010; Garrison & Arbaugh, 2007; Swan et al., 2009). Additionally, researchers have explored the relationships among the three elements. For example, Garrison et al. (2010) and Gutiérrez-Santiuste et al. (2015) reported that teaching presence predicts social and cognitive presences, while Lin et al. (2015) found that cognitive presence positively affects teaching effectiveness.

The CoI framework, however, has also received criticism. For example, Rourke and Kanuka (2009) questioned the level of support provided by the framework for deep and meaningful learning in online courses. In response, Akyol et al. (2009) pointed out that the CoI framework was designed as a model of the learning process that emphasises knowledge construction. Another critique is that the CoI framework is incomplete and would benefit from including additional elements. Suggested elements include emotional presence (Cleveland-Innes & Campbell, 2012), learner presence (Shea & Bidjerano, 2012), and autonomy presence (Lam, 2015); however, the validity of these elements remains to be established (Kozan & Caskurlu, 2018).

Gap that current study intends to fill

Accounting education research has seen a growing interest in exploring student perceptions and behaviour in blended and online learning environments. For example, Osgerby (2013) investigated accounting students’ expectations, perceptions, and competence in a blended learning environment. Despite several challenges with communication and interaction, accounting students had a favourable view of the blended learning approach. Extending this, Malan (2020) examined student engagement in an online accounting course and found that while students rated the course as cognitively engaging, student collaboration remained a persistent challenge. More recently, Russo et al. (2022) investigated the transferable skills of accounting students in blended learning and found that a well-designed blended learning approach improved students’ writing skills.

A literature review examined the publications in five accounting education journals – Accounting Education, Advances in Accounting Education, Issues in Accounting Education, Journal of Accounting Education, and The Accounting Educators’ Journal – over the past five years. The review identified eight themes concerning student perceptions and behaviour in blended and online learning: academic misconduct (Golden & Kohlbeck, 2020; Reyneke et al., 2021), assessment and feedback (Helfaya, 2019; Mihret et al., 2017), learning modes (Miley & Read, 2019; Stice et al., 2020), student engagement (Malan, 2020; Peng & Abdullah, 2018), student perceptions (Fortin et al., 2019; White et al., 2021), student transferable skills (Russo et al., 2022), student performance (Cheng & Ding, 2021; Krasodomska & Godawska, 2021), and technology issues (Lento, 2017; Lento, 2018).

Despite the growing interest in student perceptions and behaviour in blended and online learning environments, there is a research gap in investigating student perceptions of CoI in accounting education settings. This research gap is unexpected in light of the COVID-19 pandemic, which compelled higher education institutions to adopt a blended approach combining both in-person and online elements in accounting education (Malan, 2020; Sangster et al., 2020). Sangster et al. (2020) thus call for empirical research to address the challenges posed by blended and online learning, including the formation of learning communities. In response to this call, the present study seeks to conduct a cross-cultural assessment of the CoI instrument (Arbaugh et al., 2008), comparing its application between students in the UK and the US.

The present study contributes to the CoI literature by applying Garrison and Archer’s (2000) framework to the UK accounting education context. The scarcity of CoI-related research in the UK, as noted by Stenbom (2018), highlights the significance of this study. Furthermore, the ongoing shift in accounting education brought about by the COVID-19 pandemic, which has led to a blended approach (Malan, 2020; Reyneke et al., 2021; Russo et al., 2022; Sangster et al., 2020), renders this study particularly relevant.

Furthermore, the study engages in a cross-cultural comparison by contrasting the findings of the present study in the UK with those of Arbaugh et al. (2008) in the US. If differences are identified between the two studies, Hofstede’s (2001) cultural dimension theory will be employed to explain the differences. Hofstede’s cultural dimension theory has been widely used to understand cultural differences between nations (Kirkman et al., 2006) and has previously been applied to cross-cultural comparisons in accounting education research (Driskill & Rankin, 2020). Hofstede’s (2001) cultural dimension theory has five dimensions: individualism-collectivism, power distance, uncertainty avoidance, masculinity-femininity, and long-term versus short-term orientation. The dimensions offer a framework for analysing differences between UK and US students in this study. The next section describes the research methods used in the study.

Methods

Questionnaire

An online questionnaire was administered to collect data on student perceptions of CoI. The questionnaire consists of 34 items from Arbaugh et al.’s (2008) CoI instrument (see the CoI instrument in the Appendix). The items are presented on a five-point Likert scale ranging from ‘strongly disagree’ to ‘strongly agree’. The questionnaire also includes demographic items such as gender and age. A pre-test was conducted with a dozen students to identify and eliminate any biases or misunderstandings that may have existed in the questionnaire.

Questionnaire administration

The online questionnaire was administered to students enrolled in accounting programmes at a UK university in 2021 and 2022 (see Note 2). The students were informed that their participation was entirely voluntary and that their responses would be kept anonymous and confidential. No personal information would be divulged to anybody outside the research team.

Respondents

In total, 245 usable responses were received. Amongst the respondents, 58.8% (n = 144) were female students and 41.2% (n = 101) were male students. The ages of the respondents ranged from 18 to 30, with an average of 20.7 (SD = 2.97). All the respondents were enrolled in accounting programmes.

Data analysis strategy

The data analysis proceeded in two steps. First, exploratory factor analysis (EFA) was conducted to assess the factorial structure of the CoI items. In the EFA, four issues were considered: the suitability of conducting EFA, the extraction method, the number of extracted factors, and the rotation method. Specifically, the suitability of conducting EFA was assessed using Bartlett’s test of sphericity (Hair et al., 2010) and the Kaiser-Meyer-Olkin measure of sampling adequacy (Field, 2013). Principal components analysis was chosen as the extraction method as it allows for a comprehensive analysis of variance and extracts factors that explain the highest possible percentage of the variance in a dataset of the sampled variables (Tabachnick & Fidell, 2007; Thompson, 2004). Two approaches were used to determine the number of extracted factors: a rule of thumb to extract factors with eigenvalues greater than one (Kaiser, 1960) and the scree plot test (Cattell, 1966). Oblimin rotation was performed as the underlying factors were expected to be related (Field, 2013; Heckman & Annabi, 2005).
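
To make this sequence concrete, the sketch below shows how such an EFA could be run in Python with the factor_analyzer package. It is a minimal illustration rather than the code used in this study: the file name coi_responses.csv and all variable names are hypothetical, and it assumes the 34 Likert-scale items are stored as numeric columns of a pandas DataFrame.

```python
# Illustrative EFA pipeline (not the authors' code); assumes a CSV with one
# numeric Likert-scale column per CoI item.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

responses = pd.read_csv("coi_responses.csv")  # hypothetical file name

# 1. Suitability checks: Bartlett's test of sphericity and the KMO measure.
chi_square, p_value = calculate_bartlett_sphericity(responses)
kmo_per_item, kmo_overall = calculate_kmo(responses)
print(f"Bartlett p = {p_value:.4f}, overall KMO = {kmo_overall:.3f}")

# 2. Number of factors: Kaiser's eigenvalues-greater-than-one rule; a scree
#    plot of `eigenvalues` can be inspected as a complementary check.
fa_unrotated = FactorAnalyzer(rotation=None, method="principal")
fa_unrotated.fit(responses)
eigenvalues, _ = fa_unrotated.get_eigenvalues()
n_factors = int((eigenvalues > 1).sum())

# 3. Extract the retained factors with an oblique (oblimin) rotation, since
#    the three presences are expected to be correlated.
fa = FactorAnalyzer(n_factors=n_factors, rotation="oblimin", method="principal")
fa.fit(responses)
loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
print(loadings.round(2))
```

An oblique rotation is specified here, in line with the expectation stated above that the underlying factors are related; an orthogonal rotation such as varimax would instead force the factors to be uncorrelated.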

Second, this study conducted a comparative analysis of UK and US student perceptions of CoI by contrasting the results of the present study in the UK with those of Arbaugh et al. (2008) in the US. Similarities and differences between the two studies were compared in terms of data collection methods, data analysis methods, and research findings. The research findings compared included mean scores at the factor level, reliability, and the total variance explained by factors. The comparison was summarised in a table, highlighting the similarities and differences between the two studies.

Table 1. Questionnaire items, means and standard deviations.

Findings

Assessment of the factorial structure of the CoI instrument

Exploratory factor analysis (EFA) was conducted to assess the factorial structure of the CoI items (see Note 3). The suitability of EFA was assessed using the Kaiser-Meyer-Olkin measure of sampling adequacy, which yielded a value of 0.954. This result exceeds the conventional cut-off value of 0.5 (Kaiser, 1974), indicating that the sample is adequate for conducting EFA. Additionally, Bartlett’s test of sphericity was calculated to test whether the correlation matrix differs from an identity matrix (Field, 2013), yielding a statistically significant result (p < 0.01). This finding indicates that the correlations between variables differ significantly from zero, fulfilling the prerequisites for conducting EFA (Hair et al., 2010). The number of factors to be extracted was determined using both the eigenvalues-greater-than-one method (Kaiser, 1960) and the scree plot method (Cattell, 1966). Both methods identified three factors.

The EFA results identified a three-factor structure. The first factor comprises 13 items that pertain to teaching presence, the second factor includes six items that relate to social presence, and the third factor contains ten items that pertain to cognitive presence. The three identified factors are consistent with the three presences in the CoI framework (Arbaugh et al., 2008; Garrison & Archer, 2000). This finding highlights the significance of Garrison and Cleveland-Innes’s (2005) CoI framework as a theoretical foundation for understanding online learning (Stenbom, 2018).

Factor loadings for each item are presented in Table 2. A factor loading is the correlation coefficient between an item and the underlying factor and measures the degree to which the item is associated with the factor (Hair et al., 2010). All the items were found to load exclusively on their expected factors, with no cross-loadings observed. Teaching, social, and cognitive presences accounted for 52.0%, 16.9%, and 5.4% of the total variance in the correlation matrix, respectively.
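
As a hedged illustration of how such variance shares can be obtained, the snippet below continues the hypothetical sketch from the Methods section: under a principal components extraction on a correlation matrix, each standardised item contributes one unit of variance, so each factor's share is its eigenvalue divided by the number of items. The variable names carry over from that sketch and are assumptions, not the study's actual code.

```python
# Proportion of total variance per retained factor (hypothetical continuation
# of the earlier sketch): eigenvalue divided by the number of items.
n_items = responses.shape[1]
for i, eigenvalue in enumerate(eigenvalues[:n_factors], start=1):
    print(f"Factor {i}: {eigenvalue / n_items:.1%} of total variance")
```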

Table 2. EFA results.

Notably, five items demonstrated factor loadings of less than 0.5, the cut-off value recommended by Hair et al. (2010). The items include three social presence items (items 17, 18, and 21) and two cognitive presence items (items 26 and 32). The low factor loadings could be attributed to two reasons. First, it is possible that the phrasing of the items had subtle nuances that impacted their factor loadings. As Arbaugh et al. (2008) acknowledge, some items in their CoI instrument may not factor out as cleanly as they should due to ambiguities in their wording. Second, it is plausible that the items may measure constructs other than the intended factors. For example, Item 17 ‘I feel comfortable conversing through the online medium’ could potentially measure the students’ familiarity with online communication rather than social presence, leading to the observed low factor loadings.
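
The screening step described above can be illustrated with a short continuation of the hypothetical sketch: reusing the assumed `loadings` DataFrame, it flags any item whose strongest absolute loading falls below the 0.5 threshold.

```python
# Flag items whose highest absolute loading is below the 0.5 cut-off
# (continues the hypothetical `loadings` DataFrame defined earlier).
THRESHOLD = 0.5
strongest_loading = loadings.abs().max(axis=1)
weak_items = strongest_loading[strongest_loading < THRESHOLD]
print("Items with inadequate loadings:")
print(weak_items.round(2))
```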

The low factor loadings observed in the five items present issues concerning the applicability of the CoI instrument in the UK setting (DeVellis, 2017; Hair et al., 2010). As Borsboom et al. (2004) suggest, including items with low factor loadings can introduce bias into the measurement instrument and subsequent data analysis. Stenbom (2018) also highlights the need to revise Arbaugh et al.’s (2008) CoI instrument to enhance its psychometric properties and ensure it better reflects the cultural context in which it is used. It is thus advisable that future research in the UK contemplate excluding these five items from the CoI instrument to ensure accurate measurement of teaching, social, and cognitive presences. The next section proceeds to compare UK and US students’ perceptions of CoI by contrasting the findings of the present study with those of Arbaugh et al.’s (2008) study in the US.

A comparison between the present study and Arbaugh et al. (2008)

Table 3 compares the present study in the UK to Arbaugh et al.’s (2008) study in the US. The table highlights the similarities and differences between the two studies. While similarities are evident in research methods and factorial structure, the differences are particularly noteworthy. As shown in the table, notable differences include the total variance explained by factors, the techniques used to assess construct validity, and most importantly, mean scores.

Table 3. A comparison between the current study and Arbaugh et al.’s (Citation2008) study.

The analysis of mean scores at both item and factor levels revealed a significant difference between student perceptions of CoI in the two studies. Specifically, the mean scores for the UK study ranged from 3.45 to 4.64; the mean scores for the US study ranged from 2.90 to 3.63. At the factor level, the UK study had a mean score of 4.44 for teaching presence, 3.71 for social presence, and 3.98 for cognitive presence; the US study had a mean score of 3.34 for teaching presence, 3.18 for social presence, and 3.31 for cognitive presence. These findings indicate that UK students hold more positive perceptions of CoI than US students.
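
For readers who wish to reproduce the factor-level contrast, the short snippet below simply tabulates the mean scores reported above for the two samples; it is an illustration only and re-enters the published values rather than recomputing them from raw data.

```python
# Tabulate the factor-level means reported for the UK sample (present study)
# and the US sample (Arbaugh et al., 2008), plus the raw difference.
import pandas as pd

comparison = pd.DataFrame(
    {
        "UK (present study)": {
            "Teaching presence": 4.44,
            "Social presence": 3.71,
            "Cognitive presence": 3.98,
        },
        "US (Arbaugh et al., 2008)": {
            "Teaching presence": 3.34,
            "Social presence": 3.18,
            "Cognitive presence": 3.31,
        },
    }
)
comparison["Difference (UK - US)"] = (
    comparison["UK (present study)"] - comparison["US (Arbaugh et al., 2008)"]
)
print(comparison.round(2))
```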

This study explains the differences in student perceptions of CoI between UK and US students by referencing two of Hofstede’s (2001) cultural dimensions that are most relevant to the findings: (i) long-term orientation and (ii) uncertainty avoidance. Although Hofstede (2001) identifies other dimensions, these two dimensions are emphasised within the scope of this study for their direct relevance to the observed differences in educational engagement and attitudes towards learning environments. First, long-term orientation, as defined by Hofstede (2001), refers to the degree to which cultures promote delayed gratification of the material, social, and emotional needs of their members. According to Hofstede Insights (2023), the UK has a higher long-term orientation score of 51, while the US has a lower score of 26. These scores indicate that the UK culture values future-oriented thinking and delayed gratification more than the US. From an educational perspective, this can have significant implications for how students approach their learning. For example, UK students could be more motivated to invest time and effort in achieving goals, such as attending classes, actively participating in discussions, and engaging with learning materials. This level of engagement could foster greater critical thinking in their learning (Lv et al., 2022). Furthermore, the future-oriented perspective encourages students to be patient and willing to delay gratification (Cheng et al., 2011). The willingness to delay gratification could drive them to work hard and persist in their studies, thus leading to higher levels of critical thinking in their learning. In contrast, the lower long-term orientation score in the US may suggest that students are more focused on immediate satisfaction and the present, resulting in lower engagement in the learning process and lower scores in teaching, social, and cognitive presences.

Second, uncertainty avoidance refers to how uncomfortable people feel in novel and unusual situations (Hofstede et al., 2010). The UK scores relatively low on this dimension, with a score of 35, while the US scores relatively high, with a score of 46 (Hofstede Insights, 2023). These scores suggest that the UK may have a more flexible culture and be open to new ideas and ways of doing things than the US. From an educational perspective, this may have important implications. The lower level of uncertainty avoidance in the UK may indicate that students are more comfortable with ambiguity and change, making them more receptive to the collaborative and open-ended nature of the community of inquiry (Cleveland-Innes, 2019; Haynes, 2018). This encourages students to actively participate in the learning process, share their perspectives and ideas, and engage in discussions and debates with their peers, ultimately leading to higher scores in teaching, social, and cognitive presences.

Discussion and implications

This paper has conducted a cross-cultural assessment of the Community of Inquiry (CoI) instrument (Arbaugh et al., 2008), comparing its application between students in the UK and the US. The study draws on Garrison and Archer’s (2000) CoI framework, collecting data from 245 UK accounting students using Arbaugh et al.’s (2008) CoI instrument. The study examines the factorial structure of the CoI instrument in the UK context, and also engages in a cross-cultural comparison by contrasting the findings of the present study with those of Arbaugh et al. (2008) in the US.

The EFA results identify three factors, aligning with the CoI framework set out by Garrison and Archer (2000). The confirmation of the three factors highlights the importance of Garrison and Cleveland-Innes’s (2005) CoI framework as a theoretical foundation for research on online and blended learning (Fiock, 2020; Swan, 2019).

However, the EFA shows that five items exhibit inadequate factor loadings of less than 0.5, indicating that these items correlate weakly with their intended factors and have limited explanatory ability (Borsboom et al., 2004; Hair et al., 2010). The findings echo the concerns presented by Stenbom (2018) regarding the need to revise Arbaugh et al.’s (2008) CoI instrument to enhance its psychometric properties and better reflect the cultural context where the instrument is used. The study further points to potential ambiguities in item phrasing and a likelihood of measuring constructs other than those initially intended, highlighting the need to consider cultural and contextual factors in the wider application of the CoI instrument.

Furthermore, this study engages in a cross-cultural comparison by contrasting the present study in the UK with Arbaugh et al.’s (2008) study in the US. Differences identified between the two studies include mean scores, the total variance explained by factors, and the methods used to assess construct validity. In particular, UK students have higher mean scores for teaching, social, and cognitive presences than US students. The study explains the differences using two of Hofstede’s (2001) cultural dimensions, namely long-term orientation and uncertainty avoidance. It is concluded that cultures with high long-term orientation and low uncertainty avoidance could be more conducive to active engagement in the learning process and critical thinking, thus leading to high student perceptions of CoI. Based on these research findings, two recommendations are made to accounting educators.

Promote engagement and long-term oriented learning

The findings indicate that UK learners, who have a cultural tendency toward future-oriented thinking, show more positive perceptions of CoI than US students. This suggests that promoting a long-term orientation in learning, such as emphasising the importance of delayed gratification and long-term achievement, can help increase student engagement and promote positive learning outcomes. Educators should incorporate strategies that nurture this orientation, such as setting long-term goals, emphasising the future benefits of current learning activities, and incorporating reflection on progress towards these goals in their teaching practice. For example, educators can incorporate project-based assignments that span the duration of the course (Handrianto & Rahman, 2019; Ngereja et al., 2020) or use gamification elements that reward persistence and long-term engagement (Oliveira et al., 2023).

Encourage creativity, curiosity, and exploration

The findings indicate that UK students, with a lower score in uncertainty avoidance, exhibit higher CoI scores than US students. This suggests that fostering a learning environment that encourages creativity, curiosity, exploration, and a tolerance for ambiguity can help increase student engagement and promote positive learning outcomes. For example, educators could design learning experiences that include open-ended case studies and problem-solving tasks that are not confined to a singular ‘correct’ answer (Downing et al., 2011; Francis et al., 2009; Mora et al., 2017). Such an approach can stimulate creativity and exploration in students, as well as foster comfort with uncertainty and change. This will, in turn, better prepare them to actively participate in a community of inquiry.

Three directions are suggested for future research. First, one limitation of this study is its focus on the UK and US contexts, which does not represent the full spectrum of cultural diversity. For example, even though the UK’s score of 35 on the uncertainty avoidance dimension is lower than the US’s score of 46 (Hofstede Insights, 2023), both are below the global average. This indicates that the differences observed might not be as marked as those involving countries with more divergent scores. Therefore, to better understand how culture impacts student perceptions of CoI, future research could include a broader, globally comparative study. This could involve countries with distinct cultural characteristics, such as China, India, and Nigeria. Such an approach could uncover both universal and culturally specific elements of the CoI framework, providing a more comprehensive understanding of its application across diverse educational settings.

Second, the findings indicate that the CoI instrument may not translate seamlessly across different cultural settings. Future research could focus on refining and validating this instrument, or developing new instruments, to better accommodate different cultural contexts. This could involve item rephrasing or the development of entirely new items that more accurately capture the constructs of teaching, social, and cognitive presence in different cultural settings.

Third, future research could integrate additional demographic variables, such as gender, age, and educational background. One interesting area could be exploring how gender impacts student participation in the community of inquiry. Prior studies have identified gender differences in online learning experiences in other disciplines (Noroozi et al., 2022; Park & Kim, 2020), making it a promising avenue of investigation in the context of accounting education.

Conclusion

In addressing the research question, ‘how does the application of the Community of Inquiry (CoI) framework differ between accounting students in the UK and the US in the context of blended learning environments?’, this study has found significant differences. Notably, UK students scored higher on average in teaching, social, and cognitive presences compared to their US counterparts. This suggests that cultures with high long-term orientation and low uncertainty avoidance could be more conducive to active engagement in the learning process and critical thinking, thus leading to high student perceptions of CoI.

This study contributes to existing literature in three ways. First, it contributes to the blended learning literature by identifying concerns about the factorial structure of the CoI instrument in the UK accounting education context. Second, it contributes to the literature by engaging in a comparative analysis of UK and US student perceptions of CoI by contrasting the results of the present study with those of Arbaugh et al. (2008) in the US. Finally, the study adds empirical evidence to the CoI literature by testing Garrison and Archer’s (2000) CoI framework in the UK setting, where limited research has been conducted to examine student perceptions of CoI (Stenbom, 2018).

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 Although the CoI framework was proposed by Garrison and Archer (2000), it was not until the development of the CoI instrument by Arbaugh et al. (2008) that the framework could be effectively measured. The CoI instrument comprises 34 items and measures the three elements of teaching, social, and cognitive presences in the CoI framework. The CoI instrument was developed using data collected from 289 US business and education students and has since been widely used in research on online learning (Annand, 2011; Rubin et al., 2011).

2 All students were enrolled in accounting programmes designed to prepare them for a successful career in accountancy. The programmes are accredited by major accountancy bodies in the UK, such as ACCA, CIMA, CIPFA, ICAEW, and ICAS, and provide specialised training in financial and management accounting, taxation, and auditing. The programmes use a blended learning approach, combining traditional on-campus engagement, such as tutorials and workshops, with online learning via virtual learning environments like Moodle and Aula. These virtual learning environments provide students access to various learning resources, including pre-recorded lecture videos, readings, podcasts, discussion boards, interactive events, and messaging with lecturers and classmates.

3 The descriptive statistics of the CoI items are reported in Table 1, including means and standard deviations. The mean scores ranged from 3.45 for Item 15 to 4.64 for Item 4, with all the scores exceeding the midpoint of 3 on the Likert scale. This result indicates that the students hold high perceptions of CoI. Additionally, the standard deviations ranged from 0.69 for Item 32 to 1.22 for Item 15, indicating that there are similar levels of student perceptions for some items but more variation in student responses for others. Mean scores were also calculated at the factor level. The mean scores were 4.44 (SD = 0.70) for teaching presence, 3.71 (SD = 0.87) for social presence, and 3.98 (SD = 0.62) for cognitive presence. All the means exceeded the midpoint of 3 on the Likert scale, indicating high levels of student perceptions in all three presences. The standard deviations suggest a general agreement among student perceptions of the three presences at the factor level.

References

  • Akyol, Z., Arbaugh, J. B., Cleveland-Innes, M., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. (2009). A response to the review of the community of inquiry framework. International Journal of E-Learning & Distance Education/Revue internationale du elearning et la formation à distance, 23(2), 123–136.
  • Akyol, Z., & Garrison, D. R. (2011). Understanding cognitive presence in an online and blended community of inquiry: Assessing outcomes and processes for deep approaches to learning. British Journal of Educational Technology, 42(2), 233–250. https://doi.org/10.1111/j.1467-8535.2009.01029.x
  • Anderson, T., Liam, R., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Online Learning, 5(2), 1–17.
  • Annand, D. (2011). Social presence within the community of inquiry framework. The International Review of Research in Open and Distributed Learning, 12(5), 40–56. https://doi.org/10.19173/irrodl.v12i5.924
  • Arbaugh, J. B. (2007). Introduction: Project management education: Emerging tools, techniques, and topics. Academy of Management Learning & Education, 6(4), 568–569. https://doi.org/10.5465/amle.2007.27694956
  • Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the community of inquiry framework using a multi-institutional sample. The Internet and Higher Education, 11(3-4), 133–136. https://doi.org/10.1016/j.iheduc.2008.06.003
  • Borsboom, D., Mellenbergh, G. J., & van Heerden, J. (2004). The concept of validity. Psychological Review, 111(4), 1061–1071. https://doi.org/10.1037/0033-295X.111.4.1061
  • Caskurlu, S., Maeda, Y., Richardson, J. C., & Lv, J. (2020). A meta-analysis addressing the relationship between teaching presence and students’ satisfaction and learning. Computers & Education, 157, 103966. https://doi.org/10.1016/j.compedu.2020.103966
  • Castellanos-Reyes, D. (2020). 20 years of the community of inquiry framework. TechTrends, 64(4), 557–560. https://doi.org/10.1007/s11528-020-00491-7
  • Cattell, R. B. (1966). The scree test for the number of factors. Multivariate Behavioral Research, 1(2), 245–276. https://doi.org/10.1207/s15327906mbr0102_10
  • Cheng, P., & Ding, R. (2021). The effect of online review exercises on student course engagement and learning performance: A case study of an introductory financial accounting course at an international joint venture university. Journal of Accounting Education, 54, 100699. https://doi.org/10.1016/j.jaccedu.2020.100699
  • Cheng, Y.-Y., Shein, P. P., & Chiou, W.-B. (2011). Escaping the impulse to immediate gratification: The prospect concept promotes a future-oriented mindset, prompting an inclination towards delayed gratification. British Journal of Social Psychology, 50(3), 479–484.
  • Cleveland-Innes, M. (2019). The community of inquiry theoretical framework: Designing collaborative online and blended learning. In H. Beetham & R. Sharpe (Eds.), Rethinking pedagogy for a digital age (pp. 85–102). Routledge.
  • Cleveland-Innes, M., & Campbell, P. (2012). Emotional presence, learning, and the online learning environment. International Review of Research in Open and Distributed Learning, 13(4), 269–292.
  • DeVellis, R. F. (2017). Construct validity. In R. F. DeVellis (Ed.), Scale development: Theory and applications (4th ed, pp. 95–97). SAGE Publications, Inc.
  • Díaz, S. R., Swan, K., Ice, P., & Kupczynski, L. (2010). Student ratings of the importance of survey items, multiplicative factor analysis, and the validity of the community of inquiry survey. The Internet and Higher Education, 13(1-2), 22–30. https://doi.org/10.1016/j.iheduc.2009.11.004
  • Downing, K., Ning, F., & Shin, K. (2011). Impact of problem-based learning on student experience and metacognitive development. Multicultural Education & Technology Journal, 5(1), 55–69. https://doi.org/10.1108/17504971111121928
  • Driskill, T., & Rankin, R. (2020). Cross-cultural comparison of ethical reasoning of students in China and the United States. Accounting Education, 29(3), 291–304. https://doi.org/10.1080/09639284.2020.1760114
  • Field, A. (2013). Discovering statistics using IBM SPSS statistics. Sage.
  • Fiock, H. (2020). Designing a community of inquiry in online courses. The International Review of Research in Open and Distributed Learning, 21(1), 134–152. https://doi.org/10.19173/irrodl.v20i5.3985
  • Fortin, A., Viger, C., Deslandes, M., Callimaci, A., & Desforges, P. (2019). Accounting students’ choice of blended learning format and its impact on performance and satisfaction. Accounting Education, 28(4), 353–383. https://doi.org/10.1080/09639284.2019.1586553
  • Francis, C., King, J., Lieblein, G., Breland, T. A., Salomonsson, L., Sriskandarajah, N., Porter, P., & Wiedenhoeft, M. (2009). Open-ended cases in agroecology: Farming and food systems in the Nordic Region and the US Midwest. The Journal of Agricultural Education and Extension, 15(4), 385–400. https://doi.org/10.1080/13892240903309645
  • Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7–23. https://doi.org/10.1080/08923640109527071
  • Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13(1-2), 5–9. https://doi.org/10.1016/j.iheduc.2009.10.003
  • Garrison, D. R., & Arbaugh, J. B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education, 10(3), 157–172. https://doi.org/10.1016/j.iheduc.2007.04.001
  • Garrison, D. R., & Archer, W. (2000). A transactional perspective on teaching and learning: A framework for adult and higher education. Elsevier Science.
  • Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. American Journal of Distance Education, 19(3), 133–148. https://doi.org/10.1207/s15389286ajde1903_2
  • Golden, J., & Kohlbeck, M. (2020). Addressing cheating when using test bank questions in online classes. Journal of Accounting Education, 52, 100671. https://doi.org/10.1016/j.jaccedu.2020.100671
  • Gutiérrez-Santiuste, E., Rodríguez-Sabiote, C., & Gallego-Arrufat, M. J. (2015). Cognitive presence through social and teaching presence in communities of inquiry: A correlational–predictive study. Australasian Journal of Educational Technology, 31(3), 73–89. https://doi.org/10.14742/ajet.1666
  • Hair, J. F., Jr., Black, W. C., Babin, B. J., & Anderson, R. E. (2010). Multivariate data analysis: A global perspective. Pearson Education.
  • Handrianto, C., & Rahman, M. A. (2019). Project based learning: A review of literature on its outcomes and implementation issues. LET: Linguistics, Literature and English Teaching Journal, 8(2), 110–129.
  • Haynes, F. (2018). Trust and the community of inquiry. Educational Philosophy and Theory, 50(2), 144–151. https://doi.org/10.1080/00131857.2016.1144169
  • Heckman, R., & Annabi, H. (2005). A content analytic comparison of learning processes in online and face-to-face case study discussions. Journal of Computer-Mediated Communication, 10(2), https://doi.org/10.1111/j.1083-6101.2005.tb00244.x
  • Helfaya, A. (2019). Assessing the use of computer-based assessment-feedback in teaching digital accountants. Accounting Education, 28(1), 69–99. https://doi.org/10.1080/09639284.2018.1501716
  • Hofstede, G. (2001). Culture’s consequences: Comparing values, behaviors, institutions, and organizations across nations. Sage Publications.
  • Hofstede, G., Hofstede, G. J., & Minkov, M. (2010). Cultures and organizations: Software of the mind (2nd ed.). McGraw-Hill.
  • Hofstede Insights. (2023). Country Comparison: The United Kingdom and The United States. Retrieved January 19, 2023, from https://www.hofstede-insights.com/country-comparison/the-uk,the-usa/.
  • Jan, S. K., Vlachopoulos, P., & Parsell, M. (2019). Social network analysis and learning communities in higher education online learning: A systematic literature review. Online Learning, 23(1), 249–265.
  • Kaiser, H. F. (1960). The application of electronic computers to factor analysis. Educational and Psychological Measurement, 20(1), 141–151. https://doi.org/10.1177/001316446002000116
  • Kaiser, H. F. (1974). An index of factorial simplicity. Psychometrika, 39(1), 31–36. https://doi.org/10.1007/BF02291575
  • Khalid, M. N., & Quick, D. (2016). Teaching presence influencing online students’ course satisfaction at an institution of higher education. International Education Studies, 9(3), 62–70. https://doi.org/10.5539/ies.v9n3p62
  • Kirkman, B. L., Lowe, K. B., & Gibson, C. B. (2006). A quarter century of culture’s consequences: A review of empirical research incorporating Hofstede’s cultural values framework. Journal of International Business Studies, 37(3), 285–320. https://doi.org/10.1057/palgrave.jibs.8400202
  • Kozan, K., & Caskurlu, S. (2018). On the Nth presence for the community of inquiry framework. Computers & Education, 122, 104–118. https://doi.org/10.1016/j.compedu.2018.03.010
  • Kozan, K., & Richardson, J. C. (2014). Interrelationships between and among social, teaching, and cognitive presence. The Internet and Higher Education, 21, 68–73. https://doi.org/10.1016/j.iheduc.2013.10.007
  • Krasodomska, J., & Godawska, J. (2021). E-learning in accounting education: The influence of students’ characteristics on their engagement and performance. Accounting Education, 30(1), 22–41. https://doi.org/10.1080/09639284.2020.1867874
  • Lam, J. Y. (2015). Autonomy presence in the extended community of inquiry. International Journal of Continuing Education and Lifelong Learning, 8(1), 39–61.
  • Lento, C. (2017). Incorporating whiteboard voice-over video technology into the accounting curriculum. Issues in Accounting Education, 32(3), 153–168. https://doi.org/10.2308/iace-51584
  • Lento, C. (2018). Student usage of assessment-based and self-study online learning resources in introductory accounting. Issues in Accounting Education, 33(4), 13–31. https://doi.org/10.2308/iace-52252
  • Lin, S., Hung, T. C., & Lee, C. T. (2015). Revalidate forms of presence in training effectiveness: Mediating effect of self-efficacy. Journal of Educational Computing Research, 53(1), 32–54.
  • Lipman, M. (1991). Thinking in education. Cambridge University Press.
  • Lowenthal, P. R. (2010). Social presence. In Social computing: Concepts, methodologies, tools, and applications (pp. 129–136). IGI global.
  • Lv, S., Chen, C., Zheng, W., & Zhu, Y. (2022). The relationship between study engagement and critical thinking among higher vocational college students in China: A longitudinal study. Psychology Research and Behavior Management, 15, 2989–3002. https://doi.org/10.2147/PRBM.S386780
  • Malan, M. (2020). Engaging students in a fully online accounting degree: An action research study. Accounting Education, 29(4), 321–339. https://doi.org/10.1080/09639284.2020.1787855
  • Mihret, D. G., Abayadeera, N., Watty, K., & McKay, J. (2017). Teaching auditing using cases in an online learning environment: The role of ePortfolio assessment. Accounting Education, 26(4), 335–357.
  • Miley, F., & Read, A. (2019). Pragmatic postmodernism and engagement through the culture of continuous creativity. Accounting Education, 28(2), 172–194.
  • Mora, C. E., Díaz, B. A., Marrero, A. M. G., Gutiérrez, J. M., & Jones, B. D. (2017). Motivational factors to consider when introducing problem-based learning in engineering education courses. The International Journal of Engineering Education, 33(3), 1000–10017.
  • Ngereja, B., Hussein, B., & Andersen, B. (2020). Does project-based learning (PBL) promote student learning? A performance evaluation. Education Sciences, 10(11), 330. https://doi.org/10.3390/educsci10110330
  • Noroozi, O., Banihashem, S. K., Taghizadeh Kerman, N., Parvaneh Akhteh Khaneh, M., Babayi, M., Ashrafi, H., & Biemans, H. J. (2022). Gender differences in students’ argumentative essay writing, peer review performance and uptake in online learning environments. Interactive Learning Environments. https://doi.org/10.1080/10494820.2022.2034887
  • Oh, C. S., Bailenson, J. N., & Welch, G. F. (2018). A systematic review of social presence: Definition, antecedents, and implications. Frontiers in Robotics and AI, 5, 114.
  • Oliveira, W., Hamari, J., Shi, L., Toda, A. M., Rodrigues, L., Palomino, P. T., & Isotani, S. (2023). Tailored gamification in education: A literature review and future agenda. Education and Information Technologies, 28(1), 373–406.
  • Osgerby, J. (2013). Students’ perceptions of the introduction of a blended learning environment: An exploratory case study. Accounting Education, 22(1), 85–99. https://doi.org/10.1080/09639284.2012.729341
  • Park, C., & Kim, D. G. (2020). Exploring the roles of social presence and gender difference in online learning. Decision Sciences Journal of Innovative Education, 18(2), 291–312. https://doi.org/10.1111/dsji.12207
  • Peng, J., & Abdullah, I. (2018). Building a market simulation to teach business process analysis: Effects of realism on engaged learning. Accounting Education, 27(2), 208–222. https://doi.org/10.1080/09639284.2017.1407248
  • Puig, B., Blanco-Anaya, P., Bargiela, I. M., & Crujeiras-Pérez, B. (2019). A systematic review on critical thinking intervention studies in higher education across professional fields. Studies in Higher Education, 44(5), 860–869. https://doi.org/10.1080/03075079.2019.1586333
  • Reyneke, Y., Shuttleworth, C. C., & Visagie, R. G. (2021). Pivot to online in a post-COVID-19 world: Critically applying BSCS 5E to enhance plagiarism awareness of accounting students. Accounting Education, 30(1), 1–21. https://doi.org/10.1080/09639284.2020.1867875
  • Richardson, J., & Swan, K. (2003). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68–88.
  • Richardson, J. C., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior, 71, 402–417. https://doi.org/10.1016/j.chb.2017.02.001
  • Rourke, L., & Kanuka, H. (2009). Learning in communities of inquiry: A review of the literature. The Journal of Distance Education, 23(1), 19–48.
  • Rubin, B., Fernandes, R., & Avgerinou, M. (2011). How the use of virtual learning environment tools affects the online learning experience? 6th International Conference in Open and Distance Learning, Greece.
  • Russo, A., Warren, L., Neri, L., Herdan, A., & Brickman, K. (2022). Enhancing accounting and finance students’ awareness of transferable skills in an integrated blended learning environment. Accounting Education, 31(1), 67–91. https://doi.org/10.1080/09639284.2021.1961087
  • Sangster, A., Stoner, G., & Flood, B. (2020). Insights into accounting education in a COVID-19 world. Accounting Education, 29(5), 431–562. https://doi.org/10.1080/09639284.2020.1808487
  • Shea, P., & Bidjerano, T. (2012). Learning presence as a moderator in the community of inquiry model. Computers & Education, 59(2), 316–326. https://doi.org/10.1016/j.compedu.2012.01.011
  • Shea, P., Li, C. S., Swan, K., & Pickett, A. (2005). Developing learning community in online asynchronous college courses: The role of teaching presence. Journal of Asynchronous Learning Networks, 9(4), 59–82.
  • Stenbom, S. (2018). A systematic review of the community of inquiry survey. The Internet and Higher Education, 39, 22–32. https://doi.org/10.1016/j.iheduc.2018.06.001
  • Stice, E. K., Stice, J. D., & Albrecht, C. (2020). Study choices by introductory accounting students: Those who study more do better and text readers outperform video watchers. Advances in Accounting Education: Teaching and Curriculum Innovations, 24, 3–29. https://doi.org/10.1108/S1085-462220200000024007
  • Swan, K. (2019). Research on online learning. Online Learning, 11(1), 55–59. https://doi.org/10.24059/olj.v11i1.1736
  • Swan, K., Garrison, D. R., & Richardson, J. C. (2009). A constructivist approach to online learning: The community of inquiry framework. In C. R. Payne (Ed.), Information technology and constructivism in higher education: Progressive learning frameworks (pp. 43–57). IGI Global.
  • Swan, K., & Ice, P. (2010). The community of inquiry framework ten years later: Introduction to the special issue. The Internet and Higher Education, 13(1-2), 1–4. https://doi.org/10.1016/j.iheduc.2009.11.003
  • Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in online course discussions. Journal of Asynchronous Learning Networks, 9(3), 115–136.
  • Tabachnick, B. G., & Fidell, L. S. (2007). Discriminant analysis. In Using multivariate statistics (5th ed., pp. 377–438). Pearson Allyn & Bacon.
  • Thompson, B. (2004). Exploratory and confirmatory factor analysis: Understanding concepts and applications. American Psychological Association.
  • White, T., Brody, R. G., & Gupta, G. (2021). Who’s got next? An analysis of the inhibitors to mobile game adoption in an introductory accounting class. Advances in Accounting Education: Teaching and Curriculum Innovations, 25, 23–47. https://doi.org/10.1108/S1085-462220210000025002
  • Zhan, H., Demonstes, L. S., Willis, B., & Scarnati, B. (2007). An examination of teaching presence in relation to learning outcomes in fully online undergraduate courses. In R. Carlsen, K. McFerrin, J. Price, R. Weber, & D. A. Willis (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference 2007 (pp. 586–589). Association for the Advancement of Computing in Education.

Appendix: Arbaugh et al.’s (Citation2008) Community of Inquiry Scale (34 Items)

Teaching presence

  1. The instructor clearly communicates important course topics.

  2. The instructor clearly communicates important course goals/objectives.

  3. The instructor provides clear instructions on how to participate in course learning activities.

  4. The instructor clearly communicates important due dates/time frames for learning activities.

  5. The instructor is helpful in identifying areas of agreement and disagreement on course topics that help me to learn.

  6. The instructor is helpful in guiding the class towards understanding course topics in a way that helps me clarify my thinking.

  7. The instructor helps to keep course participants engaged and participating in productive dialogue.

  8. The instructor helps keep the course participants on task in a way that helps me to learn.

  9. The instructor encourages course participants to explore new concepts in this course.

  10. Instructor actions reinforce the development of a sense of community among course participants.

  11. The instructor helps to focus discussion on relevant issues in a way that helps me to learn.

  12. The instructor provides feedback that helps me understand my strengths and weaknesses relative to the course’s goals and objectives.

  13. The instructor provides feedback in a timely fashion.

Social presence

  14. Getting to know other course participants gives me a sense of belonging in the course.

  15. I am able to form distinct impressions of some course participants.

  16. Online or web-based communication is an excellent medium for social interaction.

  17. I feel comfortable conversing through the online medium.

  18. I feel comfortable participating in the course discussions.

  19. I feel comfortable interacting with other course participants.

  20. I feel comfortable disagreeing with other course participants while still maintaining a sense of trust.

  21. I feel that my point of view is acknowledged by other course participants.

  22. Online discussions help me to develop a sense of collaboration.

Cognitive presence

  23. Problems posed increase my interest in course issues.

  24. Course activities pique my curiosity.

  25. I feel motivated to explore content-related questions.

  26. I utilise a variety of information sources to explore problems posed in this course.

  27. Brainstorming and finding relevant information helps me resolve content-related questions.

  28. Online discussions are valuable in helping me appreciate different perspectives.

  29. Combining new information helps me answer questions raised in course activities.

  30. Learning activities help me construct explanations/solutions.

  31. Reflection on course content and discussions helps me understand fundamental concepts in this class.

  32. I can describe ways to test and apply the knowledge created in this course.

  33. I am developing solutions to course problems that can be applied in practice.

  34. I can apply the knowledge created in this course to my work or other non-class-related activities.
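
For readers who wish to work with the instrument in secondary analyses, the sketch below (not drawn from the study itself) illustrates one way the 34 items could be scored by presence and screened with an exploratory factor analysis in Python. The file name coi_responses.csv, the item_1 … item_34 column names, the Likert coding, the oblimin rotation, and the 0.5 loading cut-off are illustrative assumptions; the third-party factor_analyzer package is used here for convenience and is not part of the original analysis.

```python
# Illustrative sketch only (not the study's procedure): scoring the 34-item CoI
# instrument by presence and screening it with an exploratory factor analysis.
# The file name, the item_1 ... item_34 column names, the Likert coding, the
# oblimin rotation, and the 0.5 loading cut-off are all assumptions.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo

responses = pd.read_csv("coi_responses.csv")  # hypothetical file: one row per student
items = [f"item_{i}" for i in range(1, 35)]

# Subscale membership follows the appendix: items 1-13 teaching presence,
# items 14-22 social presence, items 23-34 cognitive presence.
subscales = {
    "teaching":  [f"item_{i}" for i in range(1, 14)],
    "social":    [f"item_{i}" for i in range(14, 23)],
    "cognitive": [f"item_{i}" for i in range(23, 35)],
}
presence_means = pd.DataFrame(
    {name: responses[cols].mean(axis=1) for name, cols in subscales.items()}
)
print(presence_means.describe())  # per-presence descriptive statistics

# Sampling adequacy (Kaiser, 1974), then a three-factor solution with an
# oblique rotation; items whose strongest loading falls below 0.5 are flagged.
_, kmo_overall = calculate_kmo(responses[items])
print(f"Overall KMO: {kmo_overall:.2f}")

efa = FactorAnalyzer(n_factors=3, rotation="oblimin")
efa.fit(responses[items])
loadings = pd.DataFrame(
    efa.loadings_, index=items, columns=["factor_1", "factor_2", "factor_3"]
)
print(loadings[loadings.abs().max(axis=1) < 0.5])
```

Because the rotation, extraction method, and loading cut-off are analyst choices, the output should be read as a template for replication rather than a reproduction of the reported results.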