
Measuring digital skills in community adult learning settings – implications for Australian policy development

Pages 23-44 | Received 02 Aug 2023, Accepted 14 Jan 2024, Published online: 26 Feb 2024

ABSTRACT

Social and economic participation is increasingly dependent on the proficient use of digital technologies in everyday life. Digital skills, alongside literacy and numeracy, are now foundational skills that all Australians should have. Yet, how policy makers, scholars, and adult educators can effectively identify people’s current and desired digital skills is not well established. Existing frameworks tend to be instrumentalist, listing generic abilities that all citizens need for success. Digital abilities, however, are context-specific and attempts to generalise need to be responsive to local circumstances. We present a theory-driven Digital Ability Self-Assessment Tool, developed for a government agency and piloted in a community organisation. The Tool takes a distinctive socio-material approach to overcome shortcomings of other digital ability assessment tools, which often focus on normative measures of skills attainment. Overall, we argue that authentic digital skills policy and programme development must account for the contexts in which digital technologies are used.

Introduction

Digital inclusion is increasingly critical to people’s capacities to realise their potential in all areas of life (Helsper, 2008; Ragnedda & Mutsvairo, 2018). It is closely related to other terms such as the ‘digital divide’ (van Dijk, 2020) and ‘digital inequality’ (Hargittai, 2021), which also draw attention to disparities in access to and use of digital technologies along the lines of race, geography, education, and wealth. In this paper, we focus on digital ability (comprising individuals’ skills, knowledge, and attitudes towards the internet and digital technologies) as a key enabler of and barrier to digital inclusion and, by extension, people’s social and economic participation. Digital ability has emerged as a key challenge of human development and adult education (Poveda Villalba, 2015), with the recent and ongoing COVID-19 crisis bringing renewed attention to how essential it is for people’s social and economic well-being. In addition, the Australian Federal Government recognises digital skills, alongside literacy and numeracy, as foundational skills that all Australians should have (Jobs and Skills Australia, 2023). Accordingly, benchmarking and improving digital skills through community education initiatives have become core areas of concern for digital inclusion policy.

Over the past decade digital ability initiatives have emerged across the world, with examples including the Mozilla Foundation’s Digital Skills for Digital Librarians project in the USA (Fellows et al., 2018), the Stepping into Digital Life programme (de Reynal & Richtor, 2016) by the Digital Skills Observatory in Africa, and the Be Connected programme for over-50s in Australia (Australian Government, 2022). A challenge for these kinds of programmes is identifying participants’ existing knowledge and skills. As Covello and Lei (2010) have argued, determining which digital skills are important, and how they can be identified and deployed in different local contexts, can be a barrier to developing frameworks and tools for tracking individual progress and evaluating how successful such interventions have been.

Digital abilities are context-specific, making it difficult to generalise about a ‘baseline’ of skills required for successful participation in everyday life, which poses particular challenges for adult education. The types of digital skills that may be essential to a community of migrant job seekers, for example, will not necessarily be the same as those skills that will assist elderly patients to access health and social services. Given the range of skills and needs of community members, and the specificity and nuances of social and economic contexts, it is difficult to assess digital ability in ways that sufficiently acknowledge the needs and goals of all individuals.

To address this challenge, this paper presents a Digital Ability Self-Assessment Tool (DASAT) that aims to contextualise individual-level digital ability self-assessment. The DASAT builds on Eshet-Alkalai’s (2004, 2012) digital literacy framework and is designed to enable individuals – with the assistance of a mentor – to identify their level of digital ability across different life spheres: digital friend/family, digital citizen, digital consumer, digital employee (Skov, 2016). We developed the Tool in collaboration with an Australian state government department responsible for deploying and evaluating digital ability programmes in communities, and we produced a preliminary report on the Tool’s development for the government department that commissioned it (Dezuanni, Burgess, Mitchell, Marshall, & Cross, 2018). The government team saw a specific need for a tool that would enable digital ability programme facilitators to (1) benchmark the digital skills of programme participants; (2) enable participants to identify areas of strength and weakness, and the skills of most relevance to their individual circumstances; and (3) help to evaluate if participants had improved their digital skills and achieved personal goals.

The DASAT, which draws on a socio-material approach to assessing digital ability rather than an instrumentalist ‘skills list’ approach, was piloted with constituents of a community organisation delivering a digital inclusion programme to people living with a physical disability. As part of an ongoing collaboration with the research team, this organisation’s interest in helping to test the Tool was to ensure it was inclusive and catered to diverse needs and abilities. Using the Tool with people living with a physical disability draws specific attention to issues of accessibility, mobility, and assistive technology, which have emerged as critical factors for digital inclusion, social participation, and individual agency in a digital world.

This paper addresses the research context for the Digital Ability Self-Assessment Tool as well as its research contributions. The paper introduces the concept of digital inclusion (of which digital ability is a core dimension) and explicates its links to social inclusion and community development. We survey relevant literature, focussing on digital literacies, skills, capabilities, and competencies, to explore what it means to be able to develop and assess digital abilities. We further illustrate a distinction between instrumental-technical and socio-material approaches to assessing digital skills through a description of several representative digital skills assessment tools. Following this, we present the DASAT, explain its theoretical underpinnings, and describe how it was piloted with end users. We emphasise how our Tool embodies the critical distinction we have made between technical-instrumental and socio-material approaches to understanding and assessing digital ability. We discuss the benefits and limitations of the Tool and opportunities for improvements to its design and implementation. We conclude by calling on policy makers and programme designers to recognise the importance of authentically situating digital skills development within contexts of socio-material practice.

Digital inclusion and digital abilities

The concept of ‘digital inclusion’ has emerged from legacy discourses about the ‘digital divide’, which originally focused on access to physical internet infrastructure in societies across the world (Livingstone & Helsper, 2007; Selwyn, 2004; Warschauer, 2002). As access issues have diminished for many people, other factors such as affordability of access and people’s ability to use technology in everyday life have emerged as key challenges (Thomas et al., 2021). Since the early 2000s, research has shown that to fully assess and understand the digital divide, approaches must differentiate between individuals’ digital capabilities, or what Hargittai (2002) referred to as the ‘second-level’ digital divide. The concepts of the first-, second-, and third-level digital divides highlight scholars’ progression from investigating people’s capacity to access the internet (first-level digital divide) and acquire internet skills (second-level divide), to understanding the disparate and inequitable outcomes of internet use (third-level digital divide; Scheerder et al., 2017). Thus, the more holistic concept of ‘digital inclusion’ describes nuances and gradations in digital participation in place of the more binary ‘digital divide’ (Livingstone & Helsper, 2007; Parsons & Hick, 2008; van Deursen & van Dijk, 2019).

It is well accepted that digital exclusion is accompanied by, and further compounds, other forms of social disadvantage (Helsper, 2008). Studies of digital inclusion highlight a range of social and cultural factors underpinning people’s capacity to access, afford, and use digital technologies – these factors include low levels of income (Lee et al., 2016), low levels of education (Cruz-Jesus, Vicente, Bacao, & Oliveira, 2016), physical remoteness (Freeman & Park, 2015), un/under-employment (Pirzada & Khan, 2013), older age (Olphert et al., 2005), migrant and refugee status (Alam & Imran, 2015), and homelessness and/or poverty (Humphrey, 2022). There are also known intersecting digital inclusion challenges between these cohorts and people living with disability (Ellis, Goggin, Haller, & Curtis, 2020; Macdonald & Clayton, 2013; Seale et al., 2010), such as the paradox that digital technology can be both enabling and disabling depending on one’s level of access and skills (Goggin, 2021). Accordingly, Selwyn and Facer (2013) suggest that addressing digital exclusion demands complex sets of policy responses that go beyond hardware provision and support, and the simple identification of gaps to be bridged with training. Warschauer (2002) approaches the significant challenge of the digital divide by comparing it to what he refers to as the ‘literacy divide’, suggesting that just as literacy should be considered an aspect of social development, so too must digital abilities be co-constituted in socially situated activities. Our approach to understanding individuals’ digital ability is informed by these arguments and suggests that creating effective and supportive tools to measure and develop digital capabilities, and making those tools widely available to communities, will contribute to digital inclusion objectives.

Digital ability research

Over time, and with the evolution of digital technologies, digital ability has evolved from – and is often used interchangeably with – the concepts of information literacy (Jackman & Jones, 2002), eLiteracy (Martin, 2003), and digital literacy (Katz, 2005), along with digital competence (Calvani, Cartelli, Fini, & Ranieri, 2008; Ilomäki et al., 2016) and digital capacity (Collin, Notley, & Third, 2018). In particular, the concept of digital ability has its origins in digital literacy. Broadly speaking, digital literacy has been conceived in two ways (Lankshear & Knobel, 2008): (1) as an array of technical skills that are seen as necessary to qualify as ‘digitally literate’, and (2) as mastery of cognitive and socio-emotional aspects of working in a digital environment. The former emphasises micro-level capabilities to operate technology and perform tasks (for example, being able to open an internet browser). The latter, having been informed by broader concepts such as digital citizenship, emphasises macro-level capabilities to access and critically evaluate digital information for a purpose, such as judging the authenticity of a news site (Mossberger et al., 2007).

Digital ability may also be considered in socio-material terms (Dezuanni, 2015). The ‘material’ aspect of socio-materiality takes account of how technologies’ material affordances – that is, how they are designed and manufactured – enable and constrain people’s interaction with them (see Antonio & Tuffley, 2015). The ‘social’ aspect of socio-materiality draws attention to the social contexts in which technologies are used; this includes the various motivations for using digital technologies; what these technologies enable people to achieve in their daily lives (or what they are required to achieve); and how digital technologies become entwined in users’ interactions with other people. Socio-cultural literacy theory has demonstrated that literacies are developed within the complex social practices of everyday life (Heath, 1983). In such contexts, instrumentalist accounts of literacy that over-emphasise alphabetic or standardised literacies in educational contexts often fail to meet the needs students have for life and work (Heath, 1983; Warschauer, 2002). Following this line of argument, we suggest that the development of digital ability in training or support contexts will rarely succeed if it over-emphasises decontextualised skills, particularly where they are applied to technologies and scenarios with which individuals have little familiarity. As Collin, Notley, and Third (2018) suggest, it is increasingly important to see digital technologies as an aspect of everyday social life and ‘to mobilize material and symbolic resources in order to maximize benefits, opportunities, and aspirations afforded by the digital’ (p. 26). Digital ability is, therefore, deeply entwined with successful social and economic participation and, as such, must be measured in ways that account for individuals’ circumstances, preferences, and ambitions.

In this socio-material paradigm, digital activities – and the capabilities required to undertake them – largely depend on the personal goals, attitudes, knowledges, opportunities, material access to resources, physical abilities, support and learning opportunities, and other life circumstances experienced by individuals. While there is no universally accepted definition of digital ability, we define it here as a variable range of capabilities which enable a person’s successful participation in specific socio-material and technological contexts. This definition underpins the Digital Ability Self-Assessment Tool (DASAT) and emphasises that the types of digital literacies, skills, competencies, and capabilities that are required for adults to be ‘digitally able’ are contingent on the socio-material context in which they carry out digital activities, and the reasons they have for participating.

Frameworks and tools for assessing digital ability

A review of existing conceptual frameworks and assessment tools assists us to position our Tool along the spectrum from technical-instrumental to socio-material conceptions of digital ability. In the technical-instrumental paradigm, Katz (2005) articulated one of the early popular frameworks for assessing digital literacy, encapsulating the general components of many other digital literacy models, with a focus on components (framed as actions) associated with information use, namely: define, access, evaluate, manage, integrate, and create. Another tool, the DiSTO Questionnaire, aligns with this ‘list-type’ paradigm: a tick-box Likert-scale survey for a global audience, with 157 wide-ranging questions covering a variety of digital skills in various aspects of life (e.g. work, study, leisure). This tool is comprehensive and broadly applicable, but narrowly focused on micro-level digital skills which are detached from their application in real-life contexts. The widely used Internet Skills Survey (ISS) (van Deursen et al., 2016) was designed to capture a full range of Internet skills (from basic to advanced levels) – distinct from computer skills – across populations, based on a rigorous process of literature review, cognitive interviews, pilot tests, and statistical testing of measures (e.g. validity, reliability). Developed in the UK and the Netherlands, the ISS measures include a series of mostly ‘I can’ statements under the skill themes of operational, information navigation, social, creative, and mobile. While these measures span life spheres such as families, friends, and work or consumption interests (Skov, 2016), and the ‘I can’ statements help users to reflect directly on their prior experiences to determine their understanding of existing skills (a contextualised approach), the survey’s overall aim of scalability prompted the authors to exclude a set of critical (literacy) skills ‘because they were shown to be individual context dependent and not easy to measure in general population survey research’ (van Deursen et al., 2016, p. 820; see Note 2). The UK’s Essential Digital Skills framework (Department for Education, United Kingdom, 2019) similarly employs ‘I can’ statements under several categories, but is underpinned by an apparent bias towards white-collar workers (with several items being situated in office contexts), thereby potentially excluding other diverse groups and their everyday interactions with digital technologies in the home or other workplaces.

Towards the socio-material end of the spectrum, the DigComp 2.1 framework (Carretero, Vuorikari, & Punie, 2017) – the third iteration of Ferrari’s (2013) DigComp 1.0 – is described as a ‘Digital Competence Framework for Citizens’ within the European Union (EU). Aimed at developing and understanding digital competence as a combination of information and data literacy, communication and collaboration, digital content creation, safety awareness, and problem-solving, the DigComp framework shifts emphasis from mere informational skills to a broader range of more advanced and contextualised activities, representing a move from instrumentalist to socio-culturally contextualised skills. Indeed, DigComp 2.1’s first competence (information and data literacy) encompasses Katz’s (2005) entire instrumental framework, with the remaining competences (2–5) emphasising that digitally able individuals must employ digital technologies in collaborative, creative, safe, and meaningful ways. Moreover, the DigComp 2.1 document provides contextualised examples of these competences, as well as eight proficiency levels for each competence, from foundation to highly specialised skills (Carretero, Vuorikari, & Punie, 2017). Various assessment and self-assessment tools have been created based on DigComp, notably the Digital Economy and Society Index (DESI), which summarises indicators for Europe’s digital performance and tracks the progress of 45 EU countries. The DigComp Into Action guide also provides case studies and tools to support implementation of DigComp in communities.

Despite its popularity, DigComp has drawn criticism for remaining ‘individual-centric and decontextualised’ (Airola, Rasi, & Outila, 2020, p. 259); that is, assessing digital ability in individuals alone ignores the role and importance of other actors and factors in the person’s life, which can have an enormous impact on whether and how digital skills can be acquired. For example, whether a senior person has support from family members to access and use the Internet has important ramifications for that individual’s level of digital inclusion. While our Tool also measures digital ability at the individual level, we provide extra resources for facilitators to help end users think about the community, cultural, and family norms and practices that may enable or prevent them from gaining access to some types of devices and content.

Finally, digital ability assessment tools that align with the socio-material paradigm of the above-mentioned frameworks are few but include the Northstar Information Literacy survey. This highly interactive multimedia resource aimed at job seekers involves a minute-long video orientation to explain how the assessment works; presentation of three realistic scenarios (e.g. a job search using online tools) featuring culturally diverse characters so that respondents ‘see themselves’ in the scenarios; and a detailed assessment showing ‘mastered skills’ and ‘skills for improvement’. The Northstar survey tool assesses the types of contextualised devices, software programmes, platforms, and websites job seekers are likely to encounter and use in finding and applying for jobs or education/training courses that will assist them to meet their specific needs (such as budget, travel time/distance, interests, and other commitments). Unlike other tools, the Northstar survey seamlessly brings together material and social aspects to provide respondents with a simulated experience of online job-seeking with comprehensive feedback at the end. While likely highly effective, such bespoke, digital, interactive tools may be beyond the financial and technical capability of many community-based organisations to provide in local adult education contexts. In contrast, the tool we propose is paper-based and therefore more easily rolled out.

In summary, we have outlined two broad approaches to defining and assessing digital ability. Although technical-instrumental approaches that emphasise essential and generic informational skills may be efficient and scalable to deliver, they remove assessment of digital ability from the context in which digital skills and knowledge are acquired and applied. On the other hand, although socio-material approaches that assess digital ability in terms of its capacity to help people improve their circumstances are more difficult to design and implement, they are necessary to genuinely understand an individual’s strengths, weaknesses, and opportunities for developing as human beings through digital participation. Importantly, while sophisticated, professionally produced scenario-based learning modules are laudable and arguably ‘best practice’, many community organisations do not have the resources or expertise to administer such surveys.

The Digital Ability Self-Assessment Tool

The Digital Ability Self-Assessment Tool (DASAT) presented here was developed for an Australian state government department for use in community-based workshops and needed to be easily understood and clearly presented in non-digital form. The Tool comprises two printable PDF documents: (1) a one-page self-assessment tool for participants, and (2) a facilitator companion guide. These items are designed to be implemented in paper form in a community setting. A facilitator leads an individual or group through the one-page activity sheet (Appendix A) with reference to the facilitator companion guide (Appendix B). The Tool asks participants, guided by a facilitator, to respond to a series of ‘I can’ capability statements that are grounded in and connected to ‘real world’ activities carried out with digital technologies often available in everyday life, whether at home, school, or at work. In the following sections we outline (1) the theoretical underpinnings of the Tool, (2) the design of the Tool, (3) processes for using the Tool, and (4) the role of the facilitator in assisting users.

Theoretical underpinnings of the DASAT

The Tool is structured as a matrix (see Figure 1) with three key components underpinned by three interrelated theoretical frameworks: (1) X-axis: Technology and Tasks based on Skov’s (2016) life spheres; (2) Y-axis: Sophistication of Capabilities based on complexity theory (Warren, 2008); and (3) Interactions of X and Y: Digital Capabilities, which are socially and materially situated. We now describe each component in order.

Figure 1. DASAT matrix structure.

Horizontal (X) axis: Technology and Tasks

The horizontal axis identifies technology and tasks (comprising devices, software, platforms, and technical tasks), with the following assumptions.

  1. Personal devices: Personal devices and everyday consumer technologies are designed to be accessible and relatively easy to use, depending on the purpose for which they are being used. Therefore, the ability to use them will typically be at the lower end of the scale of digital ability.

  2. Everyday software: Software like mobile apps and computer email programmes are designed to be used with minimal training. Therefore, the ability to use them will typically be at the lower end of the scale of digital ability.

  3. Digital platforms – simple tasks: Digital platforms like Facebook, Instagram, Snapchat, and TikTok may be used to undertake relatively simple or much more complex activities. Therefore, where platforms are used for simpler tasks, they will typically be at the mid-range of the scale of digital ability.

  4. Online consumption, rights, protection, and privacy: Cultural, consumer, and critical awareness of the use of digital technologies includes a range of tasks from relatively straightforward to highly complex. Placing these tasks at the mid-range of the scale of digital ability, in combination with the vertical axis of degrees of complexity, reflects this range of complexity.

  5. Digital platforms – complex activities: Where platforms are used for more complex tasks, they will typically be located at the upper range of the scale of digital ability.

  6. Creative and workplace software: Software like word processing, spreadsheet, presentation, and multimedia production software often require some training and are relatively difficult to use, particularly for professional activities undertaken in the workplace. Therefore, the capability to use them will typically be at the upper end of the scale of digital ability.

  7. Creative and workplace devices: Technologies like 3D printers often require training and are relatively difficult to use, particularly for professional activities undertaken in the workplace. Therefore, the ability to use them will typically be at the upper end of the scale.

Vertical (Y) axis: Sophistication of Capabilities

The vertical axis provides a hierarchy of sophistication of capabilities (comprising creativity, complexity, and problem-solving). These are intended as typical indicators and will not describe every individual’s experience of digital technologies.

  1. Less creative to more creative: We draw on Robinson’s (2011) definition of creativity as having ‘original ideas that have value’ (p. 3) to determine which capabilities require more and less individual creativity. Using digital technologies requires creativity in two main ways: (1) developing original ideas to troubleshoot or to work out how to achieve something, and (2) developing original ideas to create something new.

  2. Less complex to more complex: We draw on complexity theory which ‘focuses on complex systems involving numerous interacting parts, which often give rise to unexpected order’ (Warren, 2008, p. 227) to determine how digital media and technology use may be more or less complex depending on how many interacting parts there are, and how likely it is that unexpected things will occur.

  3. Less problem solving to more problem solving: We draw on the concept of problem-based learning (Boud & Feletti, 1997), which suggests individuals learn through overcoming obstacles or finding solutions to problems. Using digital technologies, for example, may require more or less problem solving depending on the complexity of the task at hand.

Interactions of X and Y: Digital Capability Statements

The Capability Statements (62 in total) are based on Eshet-Alkalai’s (2004, 2012) model of digital literacy. The statements focus on six diverse categories of ‘digital capabilities’ (or skills) (Eshet-Alkalai, 2004, 2012; Komlayut & Srivatanakul, 2017): photo-visual, re-designing, branching, information, socio-emotional, and real-time digital skills. These categories cover a range of skills that can be undertaken on a variety of devices and platforms (e.g. communicating emotions in chat rooms using emojis), thus justifying their inclusion in the Tool’s framework. The statements are phrased as ‘I can’ statements, which help users to reflect directly on their prior experiences to determine their understanding of existing skills (as successfully deployed in the Internet Skills Survey and other tools reviewed above). These statements were devised through various rounds of writing, internal testing, and re-writing involving the authors and digital inclusion practitioners from the state government.
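
To make the matrix structure concrete, the following minimal sketch (in Python) models a capability statement as a record combining the three components described above: an X-axis technology and tasks category, a Y-axis sophistication level, and an Eshet-Alkalai skill category. The field names and the example mappings are our own illustrative assumptions; they are not the actual 62 statements or alignments published in Appendices A and B.

```python
from dataclasses import dataclass

# X-axis (Technology and Tasks) categories described above, ordered roughly
# from lower to higher typical digital ability.
X_CATEGORIES = [
    "Personal devices",
    "Everyday software",
    "Digital platforms - simple tasks",
    "Online consumption, rights, protection, and privacy",
    "Digital platforms - complex activities",
    "Creative and workplace software",
    "Creative and workplace devices",
]

# Eshet-Alkalai skill categories used to classify the capability statements.
SKILL_CATEGORIES = [
    "photo-visual", "re-designing", "branching",
    "information", "socio-emotional", "real-time",
]

@dataclass
class CapabilityStatement:
    text: str            # the 'I can ...' statement shown to participants
    x_category: str      # column on the matrix: technology and tasks
    sophistication: int  # row on the matrix: higher = more creative, complex, problem-solving
    skill_category: str  # Eshet-Alkalai category noted in the Facilitator's Companion

# Illustrative entries only; the category assignments here are hypothetical.
EXAMPLE_STATEMENTS = [
    CapabilityStatement("I can use a home phone", "Personal devices", 1, "real-time"),
    CapabilityStatement("I can use a laptop or desktop computer to write code",
                        "Creative and workplace software", 5, "branching"),
]
```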

Using the DASAT

The DASAT is intended to be heuristic rather than normative. That is, it is intended to provoke reflection and promote dialogue, rather than provide a definitive indication or a scientifically valid achievement level. For ease of use, the Tool is numerically based and structured along horizontal and vertical axes. The ‘simplest’ tasks are placed in the top left of the tool (for instance, ‘I can use a home phone’). The ‘most difficult’ tasks are placed in the bottom right of the tool (for instance, ‘I can use a laptop or desktop computer to write code’). Although the hierarchy of activities is informed by a theoretical logic, all activities on the tool receive the same ‘score’ (2 points) as we did not want to impose assumptions about which activities individuals would find more or less difficult. For instance, due to contextual specificity and personal circumstances, some individuals might know how to fly a drone, but may not be able to use Facebook. Regardless, those participants who can engage confidently in more of the listed activities will receive a higher score and the assumption is that they have a higher level of digital ability than those who can do fewer tasks.

The Tool uses positive capability statements. Even those individuals who consider themselves to be lacking in digital skills will be able to score at least some points, and some may find that they have more digital capabilities than they realised. On the other hand, some individuals who can complete some quite sophisticated digital tasks might recognise that they have less ability in other areas. The Tool assesses digital ability at a single point in time. In practice, people acquire skills sporadically through their interactions with technology and other people. Though digital skills may be taught in more formal settings as a progression from basics to mastery, in less formal, community-based settings digital skills may be acquired as necessary or desired. Therefore, caution should be exercised in using the Tool as a means of tracking progression of specific capabilities. Rather, the Tool could be administered at another moment in time (perhaps after a digital skills development workshop) to gain a general sense of improvement in an individual’s overall digital ability. Using this approach, the Tool may be used as the basis for gathering qualitative data to ascertain shifts in individuals’ abilities, and particularly to help them to set benchmarks and goals for themselves.

DASAT assessment scales

By completing the DASAT, participants will achieve one of the following levels of digital ability (see ‘Tool Legend for Facilitators’ on the left side of Appendix A). A brief worked illustration of the scoring follows the list.

  1. Emerging digital ability (2-24): This individual can complete some tasks with digital technologies and/or may require assistance to complete tasks. They may be a reluctant user and may have minimal understanding of social, ethical, and legal contexts in which digital technologies are used.

  2. Emerging to medium digital ability (26-50): This individual is mostly able to independently complete simple tasks with digital technologies but may have more advanced competence in an isolated area of interest or practice. They may be reluctant to undertake digital activities on unfamiliar platforms or software. They may have minimal or partial understanding of social, ethical, and legal contexts in which digital technologies are used.

  3. Medium digital ability (52-74): This individual can independently use a range of public and personal devices and simple software. They have some understanding of social, ethical, and legal contexts in which digital technologies are used. They are capable of more complex activities on social media and may have some creative digital skills.

  4. Medium to high digital ability (76-100): This individual has advanced competence in more than one area of interest or practice. They have well-developed understandings of social, ethical, and legal contexts in which digital technologies are used and can use creative and workplace software and devices.

  5. High digital ability (102-124): This individual can use a range of devices and can use everyday software with ease for sophisticated purposes. They can use several types of creative and workplace software and devices for complex purposes and can usually solve technological problems. They have well to very well-developed understandings of social, ethical, and legal contexts in which digital technologies are used.
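
As a worked illustration of the scoring arithmetic (each ticked capability statement earns 2 points, so the 62 statements yield a maximum of 124 points, and the total places a participant in one of the five levels above), the minimal Python sketch below computes a total and maps it to a level. The function names are our own and are not part of the Tool itself.

```python
POINTS_PER_STATEMENT = 2  # every ticked capability statement earns the same score


def dasat_score(statements_ticked: int) -> int:
    """Total DASAT score for a participant."""
    return statements_ticked * POINTS_PER_STATEMENT


def ability_level(score: int) -> str:
    """Map a total score to one of the five levels in the Tool Legend."""
    if score <= 24:
        return "Emerging digital ability"
    if score <= 50:
        return "Emerging to medium digital ability"
    if score <= 74:
        return "Medium digital ability"
    if score <= 100:
        return "Medium to high digital ability"
    return "High digital ability"


# Example: a participant who ticks 40 of the 62 statements scores 80 points,
# which falls in the 'Medium to high digital ability' band.
print(dasat_score(40), "->", ability_level(dasat_score(40)))
```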

Role of the facilitator

Aside from the feedback individuals may attain about their own digital ability, the main purpose of DASAT is to provide an opportunity for a facilitator to have a conversation with the people using it. This process is supported by the Facilitator’s Companion (see Appendix B), which has two main purposes: (1) demonstrate how each capability statement is aligned with a digital skills category (from Eshet-Alkalai’s framework), and (2) provide further brief and extended examples of how participants could interpret each capability statement to make it more relevant to their circumstance.

It may become apparent as part of the Tool’s assessment process that an individual requires or desires additional training or knowledge to acquire new digital capabilities. In this sense, use of the Tool may provide an aspirational target for future learning. If the Tool is used in a group setting, it is likely individuals will have different kinds of abilities. This provides an opportunity for members of the group to assist those who are less able with specific tasks. That is, it may help to identify existing abilities within the group that can be shared. Completing the DASAT in a group setting may also lead to the design of a programme of activities over several sessions to meet the needs of different participants. In the best examples, this may involve an interest-driven group project where participants can contribute according to their ability levels and where peer mentoring and tutoring become the norm to enable others to acquire new capabilities.

In applying the Tool in community contexts, the facilitator should keep in mind that this is not a strictly diagnostic tool; rather, it is designed to promote conversations among participants and their facilitator/mentor about digital capabilities that are relevant to everyday life – at home, in the community, and at work. These capability statements have been developed for users who are most likely to attend the kinds of digital ability workshops being offered in community settings. Depending on the intended participants, the facilitator may need to adjust these statements. For instance, a modified set of capabilities may be required for a workshop involving teenagers in comparison to seniors. Participants should also be mindful that the Tool is not an objective test of digital competence; rather, it provides an indication of the participants’ interests and orientations towards different types of digital technologies and activities.

Piloting the DASAT

The above-described version of the Digital Ability Self-Assessment Tool was piloted to ascertain its efficacy in a community context. The pilot activity was approved by the Queensland University of Technology’s (QUT) Human Research Ethics Committee (Ref: LR 2022–5146–8276). The two-hour workshop, facilitated by two of the authors, was hosted in the boardroom of a community organisation. Nine people living with a physical disability and their support workers participated: eight in-person and two using Microsoft Teams (one with video and audio, one with audio only). Participants in the room received a physical A3, colour-printed version of the Tool, while online participants received PDF copies. The facilitators provided a brief background to the research and instructions for completing the self-assessment, before asking participants to do a ‘cold run’ assessing their own digital abilities. Participants spontaneously worked in pairs (e.g. a person with a disability and their support person, or people sitting next to each other), ticking off the capabilities they felt confident in and adding up their scores (2 points each). As instructed, participants made notes directly on the A3 sheets (or took digital/handwritten notes if online) and asked clarifying questions of the facilitators throughout. For example, two participants asked what was included in ‘multimedia’, to which the facilitators (using the Facilitator’s Companion) responded: text, audio, video, animations. The facilitators took handwritten notes throughout the workshop and collected the A3 sheets afterwards as data. Data were further supplemented by participants responding to evaluative questions via email after the workshop. Analysis of the data involved considering aspects of the Tool one by one; namely, the overall approach and structure of the Tool, the relevance and presentation of capability statements, and the role of the facilitator and associated materials. Feedback and comments about each aspect were grouped together and the workshop facilitators worked together to thematically analyse the data to arrive at the findings below.

Overall evaluation of the DASAT

Approach and structure

Overall, the Tool and its components ‘made sense’ to participants. Participants were able to navigate and self-assess their own capabilities, determine areas of strength and weakness, and gain a general understanding of their digital ability. Some participants found it helped to award themselves 2 points for capabilities they felt very confident with, and 1 point for capabilities they felt they could demonstrate but which were less familiar to them. However, as was emphasised in the socio-material approach to developing the Tool, participants were not particularly concerned with attaining and comparing normative scores, and found the Tool to be an effective conversation starter for exploring their own digital capabilities as they related to their own circumstances.

The paper-based, A3 sheet was well-received by the participants in the room, while the PDF version used by online participants (not the intended way of using the Tool) presented problems with viewing the whole sheet at once, and printing to a home printer on A4 made the text illegible. This confirmed our decision to present the Tool in printable format for in-person use in community contexts. There was discussion, however, about the possible need to develop a digital or web-based version of the Tool to maximise reach and impact in communities.

Relevance and presentation of capability statements

While participants understood the logic and components that informed the Tool, they suggested clarifications be made to individual capability statements (e.g. some of the terms used in the capability statements, such as ‘difficult’ or ‘complex’, are quite subjective). Participants also suggested incorporating other everyday skills, such as those relating to telehealth, which have been increasingly critical in recent years, and which are particularly relevant to this group who are in frequent contact with health providers. Participants understood and appreciated the inclusion of capabilities of varying degrees of complexity, creativity, and problem solving, as well as their application to various devices. However, participants felt the explicit inclusion of the horizontal and vertical axes on the A3 sheet had several drawbacks. Namely, space is wasted by positioning the 62 capability statements within a matrix structure, causing the text within each box to be very small and illegible to some, including one vision-impaired participant. Participants also said having all the capability statements on one page was overwhelming.

Finally, suggestions were made for adapting existing, and creating new, capability statements to address more specific needs of people living with a disability. For example, some statements could relate to use of assistive technologies (e.g. a screen reader), digital signatures, Bluetooth for syncing, mobile hotspot for Wi-Fi, and cloud-based platforms and data storage. Results suggest that less emphasis could be placed on gaming technologies (which were not of great relevance to most participants) to make room for health-related technologies (e.g. medical appointment apps and video conferencing). Participants also recognised the media and consumer literacy capability statements as important, less obvious skills to include in the Tool.

Facilitator’s role and resourcing

One of the novel features of our Tool is that it includes a Facilitator’s Companion. In the workshop, the facilitator’s role of providing clarification on capability statements was highlighted. In particular, it was noted that participants needed to discuss and clarify the distinction between digital ability and other determinants of their capacity to perform activities. For example, some participants identified they have the digital capability to ‘use a desktop computer to create/play multimedia online’ but do not have access to a reliable, high-speed internet connection to do this in real life. Others pointed out that cost of devices, software, and connections can present barriers to executing digital capabilities. For example, participants identified that while they might have the digital capability to ‘protect their device from viruses’ they may lack the financial resourcing to maintain a subscription to anti-virus software. Furthermore, several participants identified they have the digital capability to ‘use an automatic teller machine’ but some lacked the physical ability to reach the machine (e.g. if they were wheelchair dependent) or navigate the touchscreen interface (e.g. if they have mobility/dexterity impairment).

Results suggested that the Tool could be improved by providing facilitators with additional information and prompts to carefully consider who is in their workshop and to anticipate and plan for the types of contextual factors that may be relevant for an end user’s assessment of their digital ability. Facilitators could also be supported to design tactics to assist participants to isolate their self-assessment of each digital capability from other factors that might affect their capacity to carry out the task. This approach may enable facilitators to be responsive to diverse needs, instead of the Tool needing to pre-empt the plethora of access, affordability, and ability challenges end users may face. This reinforces the Tool’s intended purpose to promote conversation and pathways to learning, rather than being a diagnostic tool.

Discussion

We have argued that established instrumentalist approaches to measuring and addressing people’s digital inclusion needs have limitations in comparison to socio-material approaches that recognise individuals’ specific circumstances and contexts. In particular, we have shown that most frameworks and tools for assessing digital ability in the community are informed by technical-instrumental approaches, which tend to focus on normative measures of skills attainment that are often too generic to be usefully applied to everyday digital activities. In contrast, we sought to present a Digital Ability Self-Assessment Tool that takes a socio-material approach to understanding and assessing digital ability for digital inclusion. We now critically evaluate how the Tool succeeded in this aim, where improvements can be made, and account for limitations.

First, we attempted to make the Tool context-specific, but our pilot shows that users want it to be even more tailored to their needs so that it can form an effective baseline from which to devise digital learning goals. That is, the pilot emphasises the theoretical point we make that to be of use in community contexts, a digital skills assessment tool needs to be as tailored as possible. While highly generic tools may be more attractive and convenient for policymakers for cross-community benchmarking, generic tools are unlikely to meet the needs and expectations of specific communities because they are too general to be meaningful in local contexts. While ideally a new Tool would be created for every context, in reality, a balance needs to be achieved between a tailored approach and resourcing (which is often stretched in community organisations).

Second, the pilot emphasised that the best use of such tools is to begin conversations and to set future learning goals. It reinforced that digital ability assessment tools should be used in context with facilitators or mentors who can add nuance and value to their use. Ideally, these facilitators will receive some training or support, such as working through The Digital Mentor’s Handbook, jointly created by Australia Post and QUT (2018–2019), undertaking training with Be Connected through How to be a Digital Mentor, or doing Infoxchange’s GoDigi Mentors Training programme. Policymakers could, therefore, provide community organisations with packages of resources that help facilitators effectively benchmark and improve participants’ digital ability, thereby improving overall outcomes for communities.

Third, and related to point one, the pilot confirmed that the more context-responsive the tool is, the more effective it will be. Therefore, it would be appropriate to create several versions of the Tool for different audiences in collaboration with community organisations. With continued funding and in collaboration with community partners, our goal would be to co-create several versions of the tool with context specific statements, beginning with a version for people living with a disability.

We acknowledge several limitations of the Tool. First, as an individual-level, personalised intervention, the DASAT may overlook some macro-level influences that end users may not self-identify (such as broader political or cultural norms). However, by emphasising social and material aspects of digital participation, we expressly seek to mitigate such weaknesses that are common to existing digital ability frameworks and tools. Second, our Tool is theory-driven and has not yet been widely tested in the field beyond the pilot, although further in-field testing of revised versions of the Tool may be the focus of future work. Third, we recognise that the Tool represents one possible hierarchy of abilities that may not fit every community or individual’s capabilities or context. The DASAT makes some assumptions about the kinds of activities likely to be undertaken by individuals at various levels of digital ability. Some individuals, however, may have atypical digital media experiences and expectations. Furthermore, individuals may find particular activities more or less difficult for a variety of reasons. Finally, self-reporting by participants with this Tool has some pitfalls. For example, what people say they can do and what people can actually do sometimes differ (Araujo, Wonneberger, Neijens, & de Vreese, 2017).

Implications for policy and programme development

The development and piloting of the DASAT has shown that it is possible to create contextually based and socio-materially informed indicators of digital ability. In our view, policymakers who are creating benchmarks for digital skills or ability are likely to be more successful where they aim to reflect authentic instances of learning and skills attainment. We recognise that there is a tension between the development of scalable policy solutions and being responsive to individuals’ needs. Nonetheless, this article demonstrates that it is possible and desirable to find solutions that promote authentic goals.

Conclusion

In this paper we presented a self-assessment tool for measuring digital ability in community contexts. Underpinned by a theory-driven, socio-material approach, the Digital Ability Self-Assessment Tool (DASAT) accommodates the assessment of a wide range of activities and technologies in a compact format. This article and the Tool contribute to digital inclusion scholarship and practice by articulating contextualised ways to measure digital skills and may inform strategies for digital skills development that meet the needs of people in Australia and internationally. Our research helps to address the shortcomings of popular, instrumentalist frameworks and tools which, though useful and appropriate for scalability across regions and countries, neglect to conceptualise, assess, and foster digital skills as they are applied by diverse people in local contexts. In response, our socio-material approach provides policy makers, scholars, and practitioners with an alternative, complementary Tool to understand the personalised digital needs and abilities of people to inform interventions that can help bolster meaningful and authentic digital participation.

Supplemental material

Appendix A_Self_assessment_tool_facilitators_Final.pdf


Appendix B_Facilitator_companion_final.pdf


Acknowledgments

The authors thank Queenslanders with Disability Network for their assistance in testing our Digital Ability Self-Assessment Tool.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Supplementary data

Supplemental data for this article can be accessed online at https://doi.org/10.1080/22041451.2024.2306573

Additional information

Funding

This research was funded by the Queensland Government and received approval from the QUT Human Ethics Committee.

Notes

2. From 2021, the Internet Skills Survey (ISS) underpins the measures used by the Australian Digital Inclusion Index (ADII) to assess Digital Ability across the Australian population year-on-year, providing a primary source of insight for Australian digital inclusion scholars and practitioners across sectors and geographies.

References

  • Airola, E., Rasi, P., & Outila, M. (2020). Older people as users and non-users of a video conferencing service for promoting social connectedness and well-being – A case study from Finnish Lapland. Educational Gerontology, 46(5), 258–269. doi:10.1080/03601277.2020.1743008
  • Alam, K., & Imran, S. (2015). The digital divide and social inclusion among refugee migrants: A case in regional Australia. Information Technology & People, 28(2), 344–355. doi:10.1108/ITP-04-2014-0083
  • Antonio, A., & Tuffley, D. (2015). Bridging the age-based digital divide. International Journal of Digital Literacy and Digital Competence, 6(3), 1–15. doi:10.4018/IJDLDC.2015070101
  • Araujo, T., Wonneberger, A., Neijens, P., & de Vreese, C. (2017). How much time do you spend online? Understanding and improving the accuracy of self-reported measures of internet use. Communication Methods and Measures, 11(3), 173–190. doi:10.1080/19312458.2017.1317337
  • Australian Government. (2022). Be Connected – improving digital literacy for older Australians. Department of Social Services. https://www.dss.gov.au/seniors/be-connected-improving-digital-literacy-for-older-australians
  • Boud, D., & Feletti, G. (1997). The challenge of problem-based learning. London: Routledge.
  • Calvani, A., Cartelli, A., Fini, A., & Ranieri, M. (2008). Models and instruments for assessing digital competence at school. Journal of E-Learning and Knowledge Society, 4(3), 183–193.
  • Carretero, S., Vuorikari, R., & Punie, Y. (2017). DigComp 2.1: The digital competence framework for citizens with eight proficiency levels and examples of use (EUR 28558 EN). doi:10.2760/38842
  • Collin, P. J., Notley, T., & Third, A. (2018). Cultivating (digital) capacities: A role for social living labs? In M. Dezuanni, M. Foth, K. Mallan, & H. Hughes (Eds.), Digital participation through social living labs (pp. 19–35). Cambridge, MA: Chandos (Elsevier).
  • Covello, S., & Lei, J. (2010). A review of digital literacy assessment instruments. New York: Syracuse University.
  • Cruz-Jesus, F., Vicente, M. R., Bacao, F., & Oliveira, T. (2016). The education-related digital divide: An analysis for the EU-28. Computers in Human Behavior, 56, 72–82. doi:10.1016/j.chb.2015.11.027
  • de Reynal, L., & Richtor, B. (2016). Stepping into digital life: The digital skills observatory research report. Kenya: Digital Skills Observatory/Mozilla. Retrieved from http://mozillafoundation.github.io/digital-skills-observatory/
  • Department for Education, United Kingdom. (2019). Essential Digital Skills Framework. Retrieved from https://www.gov.uk/government/publications/essential-digital-skills-framework/essential-digital-skills-framework
  • Dezuanni, M. (2015). The building blocks of digital media literacy: Socio-material participation and the production of media knowledge. Journal of Curriculum Studies, 47(3), 416–439. doi:10.1080/00220272.2014.966152
  • Dezuanni, M., Burgess, J., Mitchell, P., Marshall, A., & Cross, A. (2018). Measuring and evaluating digital ability for digital inclusion in Queensland: A report for the Queensland Department of housing and public works. QUT Digital Media Research Centre.
  • Ellis, K., Goggin, G., Haller, B., & Curtis, R. (Eds.). (2020). The Routledge companion to disability and media. New York: Routledge.
  • Eshet-Alkalai, Y. (2004). Digital literacy: A conceptual framework for survival skills in the Digital Era. Journal of Educational Multimedia and Hypermedia, 13(1), 93–106.
  • Eshet-Alkalai, Y. (2012). Thinking in the digital era: A revised model for digital literacy. Issues in Informing Science and Information Technology, 9(2), 267–276. doi:10.28945/1621
  • Fellows, M., Davis, K., & Russell-Sauve, C. (2018). Learning and leading: An evaluation of the digital skills for digital librarians project. Seattle: Technology & Social Change Group, University of Washington Information School.
  • Ferrari, A. (2013). DIGCOMP: A framework for developing and understanding digital competence in Europe (EUR 26035 EN). Joint Research Centre. Retrieved from http://digcomp.org.pl/wp-content/uploads/2016/07/DIGCOMP-1.0-2013.pdf
  • Freeman, J., & Park, S. (2015). Rural realities: Digital communication challenges for rural Australian local governments. Transforming Government: People, Process and Policy, 9(4), 465–479. doi:10.1108/TG-03-2015-0012
  • Goggin, G. (2021). Disability, Internet, and digital inequality: The research agenda. In E. Hargittai (Ed.), Handbook on digital inequality (pp. 255–273). London: Edward Elgar Publishing.
  • Hargittai, E. (2002). Second-level digital divide: Differences in People’s online skills. First Monday, 7(4). doi:10.5210/fm.v7i4.942
  • Hargittai, E. (Ed.). (2021). Handbook of digital inequality. London: Edward Elgar Publishing.
  • Heath, S. B. (1983). Ways with words: Language, life and work in communities and classrooms. Cambridge: Cambridge University Press.
  • Helsper, E. (2008). Digital inclusion: An analysis of social disadvantage and the information society. London: Department for Communities and Local Government.
  • Humphrey, J. (2022). Homelessness and mobile communication: Precariously connected. Singapore: Springer Nature.
  • Ilomäki, L., Paavola, S., Lakkala, M., & Kantosalo, A. (2016). Digital competence – An emergent boundary concept for policy and educational research. Education and Information Technologies, 21(3), 655–679. doi:10.1007/s10639-014-9346-4
  • Jackman, L. W., & Jones, L. D. (2002). Information literacy, Information Communications Technologies (ICT) and the Nongovernmental Organization (NGO)/non-profit world: A practitioner’s perspective. Information Literacy Meeting of Experts. Prague, The Czech Republic: UNESCO, the U.S. National Commission on Libraries and Information Science, and the National Forum on Information Literacy. Retrieved from https://pdfs.semanticscholar.org/4d1b/77cc90f9aafa1e3cffca90e3ea15fb2289db.pdf
  • Jobs and Skills Australia. (2023). JSA foundation skills study. Australian Government. Retrieved from https://www.jobsandskills.gov.au/sites/default/files/2023-04/Foundation%20Skills%20Study%20-%20Discussion%20paper.pdf
  • Katz, I. R. (2005). Beyond technical competence: Literacy in information and communication technology. Educational Technology, 45(6), 44–47.
  • Komlayut, S., & Srivatanakul, T. (2017). Assessing digital literacy skills using a self-administered questionnaire. Review of Integrative Business and Economics Research, 6(3), 74–85. Retrieved from http://buscompress.com/index.html
  • Lankshear, C., & Knobel, M. (2008). Introduction. In C. Lankshear & M. Knobel (Eds.), Digital literacies: Concepts, policies and practices (pp. 1–16). New York: Peter Lang.
  • Lee, H., Lee, S. H., & Choi, J. A. (2016). Redefining digital poverty: A study on target changes of the digital divide survey for disabilities, low-income and elders. Journal of Digital Convergence, 14(3), 1–12. doi:10.14400/JDC.2016.14.3.1
  • Livingstone, S., & Helsper, E. (2007). Gradations in digital inclusion: Children, young people and the digital divide. New Media & Society, 9(4), 671–696. doi:10.1177/1461444807080335
  • Macdonald, S. J., & Clayton, J. (2013). Back to the future, disability and the digital divide. Disability & Society, 28(5), 702–718. doi:10.1080/09687599.2012.732538
  • Martin, A. (2003). Towards E-Literacy. In A. Martin & H. Rader (Eds.), Information and IT literacy: Enabling learning in the 21st Century (pp. 3–23). London: Facet.
  • Mossberger, K., Tolbert, C. J., & McNeal, R. S. (2007). Digital citizenship: The Internet, society, and participation. Cambridge, MA: MIT Press.
  • Olphert, C. W., Damodaran, L., & May, A. J. (2005, August). Towards digital inclusion – Engaging older people in the ‘Digital World’. Accessible design in the digital world conference, Dundee, Scotland (pp. 1–7).
  • Parsons, C., & Hick, S. F. (2008). Moving from the digital divide to digital inclusion. Currents: Scholarship in the Human Services, 7(2), 1–16.
  • Pirzada, K., & Khan, F. (2013). Measuring relationship between digital skills and employability. European Journal of Business and Management, 5(24), 124–134.
  • Poveda Villalba, S. C. (2015). Conscientisation and human development: The case of digital inclusion programmes in Brazil [PhD diss.]. Royal Holloway, University of London.
  • Ragnedda, M., & Mutsvairo, B. (eds.). (2018). Digital inclusion: An international comparative analysis. London: Rowman & Littlefield.
  • Robinson, K. (2011). Out of our minds: Learning to be creative (rev. ed.). West Sussex: Wiley.
  • Scheerder, A., van Deursen, A., & van Dijk, J. (2017). Determinants of Internet skills, uses and outcomes. A systematic review of the second- and third-level digital divide. Telematics and Informatics, 34(8), 1607–1624. doi:10.1016/j.tele.2017.07.007
  • Seale, J., Draffan, E. A., & Wald, M. (2010). Digital agility and digital decision‐making: Conceptualising digital inclusion in the context of disabled Learners in higher education. Studies in Higher Education, 35(4), 445–461. doi:10.1080/03075070903131628
  • Selwyn, N. (2004). Reconsidering political and popular understandings of the digital divide. New Media & Society, 6(3), 341–362. doi:10.1177/1461444804042519
  • Selwyn, N., & Facer, K. (2013). Beyond digital divide: Toward an Agenda for change. In Digital literacy: Concepts, methodologies, tools, and applications (pp. 1678–1696). IGI Global.
  • Skov, A. (2016). What is digital competence? The digital competence wheel. Copenhagen, Denmark: Centre for Digital Dannelse. Retrieved from https://digital-competence.eu/dc/front/what-is-digital-competence/
  • Thomas, J., Barraket, J., Parkinson, S., Wilson, S., Holcombe-James, I. … Brydon, A. (2021). Australian digital inclusion index: 2021. Melbourne: RMIT, Swinburne University of Technology, and Telstra. doi:10.25916/phgw-b725
  • van Deursen, A., Helsper, E. J., & Eynon, R. (2016). Development and validation of the Internet Skills Scale (ISS). Information, Communication & Society, 19(6), 804–823. doi:10.1080/1369118X.2015.1078834
  • van Deursen, A., & van Dijk, J. (2019). The first-level digital divide shifts from inequalities in physical access to inequalities in material access. New Media & Society, 21(2), 354–375. doi:10.1177/1461444818797082
  • van Dijk, J. (2020). The digital divide. Cambridge: John Wiley & Sons.
  • Warren, K. (2008). Chaos theory and complexity theory. In T. Mizrahi & L. Davis (Eds.), Encyclopedia of social work (20th ed.). Oxford, UK: Oxford University Press. doi:10.1093/acref/9780195306613.001.0001
  • Warschauer, M. (2002). Reconceptualizing the digital divide. First Monday, 7(7). doi:10.5210/fm.v7i7.967

Appendices

Appendix A.

Digital Ability Self-Assessment Tool

Appendix B.

Facilitator Companion