Research Article

Sociodigital futures of education: reparations, sovereignty, care, and democratisation

Arathi Sriprakash, Ben Williamson, Keri Facer, Jessica Pykett & Carolina Valladares Celis

ABSTRACT

As EdTech industries grow in reach and power it is imperative to motivate conditions for ethical challenge and contestation, always remaining attentive to the kinds of education futures that dominant imaginaries of technology foreclose. In this paper, we explore how the multiple lenses of reparations, sovereignty, care and democratisation can offer resources for envisaging alternative sociodigital futures of education. We identify how these ideas can disrupt dominant EdTech modalities, exploring how they foreground different kinds of educational relationships and priorities for education/social justice. The paper explores examples of how sociodigital futures-in-the-making have begun to materialise in a range of locations and how they urge new agendas for research, redesign and regulation in relation to EdTech. Whilst the power of the EdTech industry can be overwhelming, we suggest that critiques work from a position of abundance: there are always already many ways to radically reimagine sociodigital futures of education. We argue for the importance of recognising, surfacing, and working with these potentialities in ongoing debates about EdTech precisely to keep the future of education open.

Introduction

Digital technology has long been treated in a deterministic fashion, as inevitably steering the priorities, practices and even the promissory futures of education. Indeed, the future of education is increasingly presented by education technology companies and their advocates as one in which AI and datafication will both solve longstanding educational problems – from the global ‘learning gap’ to the lack of highly qualified teachers – and become thoroughly embedded into the everyday practices of education: teaching content and delivering lessons; personalising learning and feedback; attendance and behaviour monitoring; offering online courses and assessment; shaping educational governance, administration, finance, research, and so on (Gulson et al., 2022; Selwyn et al., 2015; UNESCO, 2023).

The World Bank, for example, while recognising that EdTech’s impact on reducing ‘learning poverty’ has been mixed at best, advocates that EdTech will be essential for training students in ‘21st Century skills’. It highlights how EdTech will enable data-driven and evidence-based educational decision making through analysis of the ‘digital footprints’ of individuals, schools and systems, and will foster better connections, relationships, motivation and engagement (World Bank, 2020). Relatedly, the OECD proposes that EdTech, such as adaptive learning technologies, will be able to ‘detect knowledge gaps’, ‘diagnose next steps for learning’ and ‘factor in behavioural dimensions’, implying the value it sees in automated processes for maximising learning (OECD, 2021). And EdTech trade publications have emphasised the role of learning analytics in identifying absenteeism and low grades to inform ‘corrective action’ by teachers (Agrawal, 2022).

Artificial Intelligence (AI), learning analytics, cloud-based platforms, and virtual learning environments are not just technologies that are inserted into educational institutions to fulfil existing educational intentions, but should be understood as sociodigital practices which fundamentally remake educational relationships and values: changing the way we relate to each other; what we understand learning to be; how we envisage the learner; and what we envisage as the purpose of education. Their introduction is not simply a question of educational efficiency, then, but of educational policy and values. Advocates of EdTech frequently refer to the transformational and revolutionary potential of digital technologies. Often, in their claims to address educational issues, they appear to occupy a progressive ground which appeals to ideals of inclusion, accessibility and equity.

Yet, we suggest the sociodigital practices being promoted by the EdTech industry – driven by for-profit industries and dominated by a rationale of educational efficiency and attainment – increasingly serve value extraction and industry expansion above the needs of educational communities or the principles of education as a common good. Principles of data extraction and enclosure of the digital commons govern these trajectories (Williamson et al., 2022). As the industry grows in reach and power, driven by estimates of multibillion dollar market opportunities (Davies et al., 2022), we must be clear-eyed about the ethical risks of such sociodigital practices for education as well as about the kinds of educational futures that are being closed down by such approaches to EdTech (Heath et al., 2023; Swist & Gulson, 2023).

We write this paper at a time when political actors, states and agencies are debating the dangers of such technologies, particularly in the face of the fast-moving AI industry.Footnote1 There is increasing public and political recognition of issues around privacy, data ownership and digital footprints, as well as the powerful monopolies of cloud service providers (mainly US- and China-based tech businesses such as Amazon Web Services, Microsoft Azure, Google Cloud Platform, Huawei Cloud, Alibaba Cloud). Within education, critical scholars of EdTech have drawn attention to ethical challenges raised by these developments: in relation to digital divides and exclusions (Hakimi et al., 2021), the scripting of pedagogic approaches and content via platforms, models and automated learning (Selwyn et al., 2023), and the monetisation of educational data (Williamson & Komljenovic, 2022).

Political responses to such concerns tend to focus on the need for greater regulation of the EdTech industry, rather than a deeper interrogation of the assumptions that have underpinned and legitimised the industry’s expansion to date – namely, the view of education as a site of value extraction for technology markets (Davies et al., 2022; Krutka et al., 2021). For example, in March 2023 the UK’s Digital Futures Commission launched a ‘practical framework’ designed to tackle privacy in digitalised school education through three priorities: regulation of EdTech providers, certification of EdTech providers, and creating new data-sharing infrastructures.Footnote2 Regulation, openness and fairness of algorithms are often seen as sufficient responses to ethical concerns raised about EdTech. However, we argue that such responses do not go far enough and can in fact exacerbate injustices by failing to expand our horizons of what sociodigital futures of education could and should become.

Thus, in this paper, we take a different approach. Whilst the power of the EdTech industry can be overwhelming, we suggest, after Gibson-Graham, that research can and should start from a position of abundance: from recognising the many, already existing ways in which educators, developers and educational activists are working to reimagine and reconstruct sociodigital futures of education (Gibson-Graham, 2008). We argue that it is now time for engaged, collaborative research in partnership with activists, educators and academics who are already challenging extractive imaginaries of sociodigital futures in education and opening up alternatives. Our aim in drawing attention to these, rather than continuing to narrate the dominance of the EdTech status quo, is to render these alternative trajectories ‘more “real”, more credible, more viable as objects of policy and activism, more present as everyday realities’ (Gibson-Graham, 2008, p. 6).

Our discussions are by no means a complete catalogue of such rich alternatives; we recognise that resistances, refusals, and remakings of all kinds occur in many different spaces – often in undocumented ways, through everyday action, or by fugitive means. There is a growing body of literature that is paying attention to activist contestations with technology. For example, in their book, Algorithms of Resistance: The Everyday Fight against Platform Power, Tiziano Bonini and Emiliano Treré (2024) explore the inventive ways in which workers, influencers and activists across the global north and south develop ‘tactics of algorithmic resistance’ to challenge the moral economy of tech industries. Similarly, Firuzeh Shokooh Valle’s (2023) book, In Defense of Solidarity and Pleasure: Feminist Technopolitics from the Global South, explores how women in the global south are resisting the narratives of technosolutionism through collective politics that prioritise praxes of solidarity and care. Following such contributions, our intention is to point readers to the trajectories for both generative critique and action that resistances and refusals are already opening up.

In the rest of this paper we discuss how four entwined concepts – reparation, sovereignty, care, and democratisation – are being mobilised by social movements, educators and scholars as conceptual tools to oppose the injustice, extraction, thoughtlessness, enclosure and dispossession evident within dominant trajectories for EdTech. We examine how activists and scholars are working with these ideas to open up new pathways for sociodigital imaginaries, disrupting dominant EdTech modalities by foregrounding different kinds of educational relationships and establishing alternative priorities for education and social justice. We conclude by considering a set of examples where such practices are being put into action and discussing what they might tell us about productive trajectories for research in the field of sociodigital futures of education.

While these are not the only four concepts that can help us think through alternative sociodigital futures of education, we have chosen them as they offer active counters to the dominant features of EdTech: extraction and dispossession (counterpolitics: reparation); imposed control and ownership (counterpolitics: sovereignty); efficiency and dehumanisation (counterpolitics: care); top-down, privatised power (counterpolitics: democratisation). We suggest there is much to be learned from the intersecting social movements that give rise to these counterpolitics – namely, Indigenous, Black and feminist struggles. But we also caution against these politics being decontextualised and co-opted by EdTech research and practice, folded into dominant discourses without altering the material conditions of sociodigital relations.

Conceptual tools to imagine alternatives

Reparations

What would technology for a just future of education look like? Much has been written on the ways in which sociodigital systems reproduce existing hierarchies and patterns of power and marginalisation, including in education (Crooks, 2021; Macgilchrist, 2019; Perrotta, 2022). For example, Madisson Whitman (2020) explores how data about students are actively made and used within the US university system through the categorisation and ordering not only of student demographic information but also their ‘behavioural’ attributes. Whitman’s ethnographic analysis shows how social assumptions are built into the categorisation of student behaviours, which are then stabilised (normalised and made invisible) within apparently neutral data analytics of student progress and performance. This can in effect hinder institutional efforts to address social inequalities, which are – through these data practices – rendered as behavioural issues subject only to personal nudges rather than requiring cultural change. It can also exacerbate the well-documented sense among members of structurally marginalised social groups (for example, by social class, ethnicity or minoritised status) that they must work harder to achieve the same educational results. In turn this can lead to higher levels of stress and anxiety in educational settings for such groups, who are at the same time encouraged to view themselves only as individuals. The harms produced through such making and using of data in education systems can be profound; they can be both allocative and representational, having real effects on student access and participation in education. As Dan McQuillan reflects on AI, such computational processes have ‘an inbuilt political commitment to the status quo, in particular to existing structures that embed specific relations of power’ (McQuillan, 2022, p. 43).
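
To make this mechanism concrete, the following minimal Python sketch is our own illustration (the field names and thresholds are invented, not drawn from Whitman’s study) of how an apparently neutral ‘engagement risk’ metric can hard-code social assumptions, recoding structural conditions as individual behaviour:

```python
# A hypothetical sketch: how a 'neutral' learning-analytics flag encodes
# social assumptions. All names and thresholds are invented illustrations.

from dataclasses import dataclass

@dataclass
class StudentRecord:
    attendance_rate: float  # fraction of scheduled sessions attended
    evening_logins: int     # LMS logins after 18:00 this term
    assignments_late: int   # assignments submitted past the deadline

def engagement_risk(s: StudentRecord) -> bool:
    """Flag a student as 'disengaged'. Every threshold below is a social
    assumption: low daytime attendance and late submissions may reflect
    paid work, caring duties or commuting costs, which the metric
    silently recodes as personal behaviour."""
    return (
        s.attendance_rate < 0.8     # assumes attendance is freely chosen
        or s.evening_logins > 20    # assumes night-time study is atypical
        or s.assignments_late >= 2  # assumes deadlines are equally costly
    )

# A student juggling shift work is flagged; the structural cause
# disappears into a behavioural category.
print(engagement_risk(StudentRecord(0.75, 30, 2)))  # True
```

Each threshold is a normative judgement, yet once the flag is computed it circulates through dashboards and nudges as if it were a neutral fact about the student.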

The increasing recognition of the constitutive injustice of such sociodigital systems has led to responses such as ‘equitable AI’ and ‘fair machine learning’. This work tends to maintain that through better computational procedures it is possible to neutralise the assumptions and hierarchies that can be baked into data making and analysis (Knox, 2022). However, Jenny Davis, Apryl Williams and Michael Yang (2021) have warned of the conceptual limitations of such appeals to fairness and equity in AI. They suggest that such approaches tend to be steeped in a kind of ‘algorithmic idealism’ – the assumption that there is a meritocratic society which can be uncovered through ‘de-biased’ computational procedures. Even though the ‘social’ in the ‘sociotechnical’ may be acknowledged in such approaches, the idealism of ‘equitable AI’ and ‘fair machine learning’ renders the social as flat rather than as deeply shaped by power and inequality. This idealism, the authors argue, ‘will always be inadequate in a context that is fundamentally unjust’ (Davis et al., 2021, p. 2). This leads the authors to urge for a concept and practice of AI and data analytics that takes seriously the realities of the present world: ‘a world in which discrimination is entrenched, elemental and compounding at the intersections of multiple marginalizations’ (Davis et al., 2021, p. 3). From this, Davis and colleagues propose a concept of ‘algorithmic reparations’ that we find particularly generative.

Reparation, in general terms, involves recognising and making amends for historical wrong-doing and ensuring the non-recurrence of those harms. It is a framework for addressing the injustices that are usually associated with direct and structural forms of state violence (for example, invasion and slavery, as well as their enduring afterlives – such as racism and poverty). Such violence can be systematised within education and also reproduced through it, pointing to the importance of a reparative lens to matters of educational justice (Sriprakash, 2022). Such existing forms of injustice can also be reproduced through digital technologies via forms of ‘algorithmic violence’ (McQuillan, 2023). Equally, as Safiya Noble has argued, tech industries have been actively complicit in perpetuating anti-Black racism: ‘we need Big Tech to come to the table and be held to account for its harms […] they owe a large debt to society, and that could be part of the resources we use’ (Noble, quoted in Oremus, 2020).

While the idea of reparations is attentive to past and present injustice, this does not mean it is purely ‘backward looking’. Indeed, Olúfẹ́mi O. Táíwò has recently argued that reparation is a ‘future making’ approach (Táíwò, 2022). This perspective is shared by scholars working on ‘algorithmic reparations’ who understand reparation as a reconstructive agenda for the making of just futures, precisely because algorithmic reform can create new material, representational and recognitional opportunity structures (Davis et al., 2021).

The idea of algorithmic reparation is to deliberately incorporate redress ‘into the assemblage of technologies that interweave macro institutions and micro-interactions’ (Davis et al., 2021, p. 4). Embedding principles of reparative justice into sociotechnical systems and their computational procedures would profoundly reshape educational and social life – how categories are made and sorted, how resources are distributed, how priorities are created, and so on. As Davis and colleagues make clear, ‘our proposal for algorithmic reparation assumes a moral duty to ameliorate, rather than aggravate, structural and historical stratifications as they manifest in computational code’ (Davis et al., 2021, p. 4). This is a major departure from the ‘de-biasing’ impetus in strands of equitable AI and fair machine learning: in the reparative framework, just outcomes are held as the evaluative standard for AI, rather than chasing the mythical ‘neutrality’ of inputs. The examples they provide include recruitment and hiring algorithms which statistically value the contributions of women, trans, and non-binary individuals rather than reinforce hiring biases, and algorithms designed to reduce the incarceration of poor Black men rather than serve as predictive tools of risk that self-perpetuate racial discrimination.
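
To illustrate this difference in evaluative standard, and emphatically not as a design proposal, the following Python sketch contrasts a ‘de-biased’ ranking with a reparative one; the scores, group labels and uplift parameter are all invented for illustration:

```python
# A hypothetical contrast between 'de-biasing' (chasing neutral inputs)
# and a reparative adjustment that takes just outcomes as its standard.
# Candidates, scores and the uplift value are invented for illustration.

candidates = [
    {"name": "A", "score": 0.78, "group": "marginalised"},
    {"name": "B", "score": 0.80, "group": "dominant"},
    {"name": "C", "score": 0.74, "group": "marginalised"},
]

def debiased_rank(pool):
    """'Algorithmic idealism': drop the group attribute and rank on raw
    scores, as if those scores were produced in a just world."""
    return sorted(pool, key=lambda c: c["score"], reverse=True)

def reparative_rank(pool, uplift=0.05):
    """Reparative alternative: explicitly adjust for documented structural
    disadvantage, building redress into the procedure itself."""
    def adjusted(c):
        return c["score"] + (uplift if c["group"] == "marginalised" else 0.0)
    return sorted(pool, key=adjusted, reverse=True)

print([c["name"] for c in debiased_rank(candidates)])    # ['B', 'A', 'C']
print([c["name"] for c in reparative_rank(candidates)])  # ['A', 'B', 'C']
```

The point of the contrast is that the reparative procedure makes its moral commitments explicit and contestable, whereas the ‘de-biased’ one launders them through an appearance of neutrality.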

Importantly, Davis et al. recognise that a reparative approach requires a departure from technosolutionism, seen in the guise of the computer science movement of ‘fair machine learning’, which they argue suffers from algorithmic idealism. Technosolutionism supposes that perfect statistical procedures can themselves address human biases and discrimination. Instead, they argue that there are some social problems for which AI cannot be the answer, whether ‘reparative’ or not. In such cases, sociodigital systems must be dismantled rather than worked with or worked upon. The principle of algorithmic reparation, then, aims not only to build better sociodigital systems but to hold existing ones to account (Davis et al., 2021).

Sovereignty

The ‘datafication’ of education refers to the extensive and increasingly intimate production and manipulation of information about educational actors, systems and processes which in turn steers and reshapes the practice and governance of education (Williamson, 2019). Educational businesses and venture capital firms are making significant investments, both imaginative and financial, in the technological materialisations and infrastructures of datafication in education; a project of algorithmic futuring of global reach and scale (Davies et al., 2022). There are numerous ethical problems one can identify in the intentions and operations of such data technology industries in education – not least questions about the ownership of data in the context of such large-scale extractive enterprises (Yu & Couldry, 2022). In this section we reflect briefly on the idea of data sovereignty and its ethical orientations for remaking sociodigital futures. In particular we set out to learn from the ongoing work by Indigenous collectives globally who have been forging a politics and practice of ‘Indigenous Data Sovereignty’, reflecting on the potential affordances of these principles for reconstructing the role and use of data in education.

The idea of Indigenous Data Sovereignty (IDS) is connected to an active history of Indigenous-led transnational movements which seek to address the injustice of state (and increasingly corporate) determination over the collection and use of data about Indigenous people (GIDA – Global Indigenous Data Alliance, 2022). Across the world, the ongoing colonial project of dispossessing Indigenous people has involved multiple forms of epistemic violence, not least the erasures and misrepresentations of Indigenous people, knowledges and experiences, as well as the use of data to control and contain Indigenous people and uphold structures of colonial dominance (Moreton-Robinson, 2015; Smith, 2021). The Indigenous Data Sovereignty movement recognises that such data ‘neither reflect Indigenous realities nor provide the requisite data resources for Indigenous communities and First Nations to fully participate in determining our own futures’ (Walter et al., 2021, p. 144). In particular, IDS seeks to challenge the dominant data narrative that is constructed around Indigenous people, which has been premised on deficit assumptions: disparity, deprivation, disadvantage, dysfunction and difference (Prehn, 2022; Walter et al., 2021). The investment in such deficit-oriented data has been profound in education, producing what education scholars have called the ‘gap-discourse’, in which Indigenous and other marginalised students are perpetually constructed as ‘lacking’ and needing to ‘catch up’, instead of using data to reveal the structures that create such injustices and to generate transformative modes of redress (Duncan, 2005; Ladson-Billings, 2006; Rudolph, 2019).

Indeed, the Indigenous Data Sovereignty movement is not ‘against’ the production and use of data. Rather it puts forward important frameworks for ensuring, in the face of enduring coloniality, Indigenous sovereignty and Indigenous futures within both small- and large-scale systems of datafication. It calls upon existing and emerging data infrastructures to ‘recognise Indigenous agency and worldviews [and] consider Indigenous data needs’ (Walter et al., 2021, p. 143). At its core, Indigenous Data Sovereignty ‘refers to the right of Indigenous peoples to exercise ownership over Indigenous Data. Ownership of data can be expressed through the creation, collection, access, analysis, interpretation, management, dissemination, and reuse of Indigenous data’ (Prehn, 2022, p. 36). This is to centre principles of self-determination in Indigenous educational futures, actively resisting the assimilationist forces of educational practice and governance. For example, the Global Indigenous Data Alliance has put forward the CARE principles for Indigenous Data Governance, which set out commitments to: design data ecosystems that enable Indigenous people to derive collective benefit (C); ensure Indigenous people have the authority to control the full governance of data (A); demonstrate responsibility for how data ecosystems are collectively benefitting Indigenous people and supporting Indigenous self-determination (R); and centre Indigenous wellbeing and rights in all data-activities and infrastructures as a guiding ethics (E).Footnote3
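
As one hypothetical illustration of how such commitments might be operationalised (this is our sketch, not part of the GIDA framework, and all field names are invented), the CARE principles could be rendered as governance metadata that is machine-checked before any analysis of a dataset is permitted to run:

```python
# A hypothetical sketch (ours, not GIDA's): CARE commitments encoded as
# governance metadata that gates any processing of a dataset.

from dataclasses import dataclass

@dataclass
class DataGovernanceRecord:
    collective_benefit: str     # C: how the community benefits, in its own terms
    authority_holder: str       # A: the Indigenous body governing the data
    responsibility_report: str  # R: evidence of benefit and self-determination
    ethics_review: bool         # E: wellbeing and rights centred throughout

def may_process(record: DataGovernanceRecord) -> bool:
    """Refuse analysis unless every CARE field is substantively present."""
    return (
        bool(record.collective_benefit.strip())
        and bool(record.authority_holder.strip())
        and bool(record.responsibility_report.strip())
        and record.ethics_review
    )

# Example: processing is blocked when no governing authority is recorded.
record = DataGovernanceRecord("community-defined benefit", "", "annual report", True)
print(may_process(record))  # False
```

Such a gate cannot, of course, substitute for the relational and political work the principles demand; it simply shows that infrastructures can be built to refuse, by default, data practices that lack Indigenous authority.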

How might the CARE principles help re-imagine and re-create different kinds of infrastructures for sociodigital futures in education? What kinds of data might be made and used to support Indigenous futures in and beyond education? Here, we can also look to the statement of ‘Indigenous data needs’ put forward by Maggie Walter and colleagues, who specifically consider the ethical issues surrounding data for Aboriginal and Torres Strait Islander people in the context of Australia, and who set out such data needs in response to the absence of these orientations in Australian social policy and research (Walter et al., 2021).

In pointing to the importance of the Indigenous Data Sovereignty movement here, our aim (particularly as non-Indigenous scholars) is not to extract from this work to produce a generalised approach to data sovereignty. Rather, we are interested in how imaginaries and infrastructures of sociodigital practices in education might seriously consider the principles of Indigenous Data Sovereignty on their own terms and, therefore, as foundational to frameworks for data justice more broadly.

Care

Educational technology in its various forms is often framed as having the capacity to work efficiently at significant scale. However, we must be attentive to that which ‘can become lost and uncared for in the process’ (McQuillan, 2022, p. 26). Think, for example, about the rationale for automated decision-making systems in schools – such as the use of facial recognition technology for roll-calls. The logic of administrative efficiency creates a demand for new forms of data collection and use in schooling, over and above, for example, educational exchanges that are premised on student-teacher reciprocity or expressions of interpersonal care (Gulson & Witzenberger, 2022; Selwyn, 2022).

Automated technology and datafication in education can be ‘directly’ harmful, but more often than not their effects are prosaic: ‘the threat of AI is not the substitution of humans by machines but the computational extension of existing social automatism and thoughtlessness’ (McQuillan, 2022, p. 63, emphasis added). Such systems are, by virtue of their detachment and abstraction, constitutively thoughtless. Scholars such as Dan McQuillan have therefore argued that care is a vital ‘counterproject’ to AI. Operating as an ‘epistemological corrective’, the idea of care ‘directs attention to situated vulnerability and dependency’ – disallowing and refuting the detachment that AI otherwise rests upon (McQuillan, 2022, p. 115). This is an idea that Neil Selwyn has recently explored too, in his argument for a ‘degrowth’ approach to educational technology. In this view, care, alongside conviviality, commoning, and autonomy, is a principle for sustainable educational technology (Selwyn, 2023). Care, in this perspective, encompasses both care of resources and care for others – emerging from a recognition of what Arturo Escobar has examined as our ‘radical interdependence’ (Escobar, 2018). How might education, and indeed educational futures, be framed as matters of care? What might emerge from centring care for, and care in, education through sociodigital practices?

Of course, much can be done in the name of ‘care’ – it is not a neutral or innocent term, but rather a practice that is itself shaped by power relations: the power to care or not; the power to define what counts as care itself. As Martin et al. (2015) write of the politics of care in technoscience, ‘Care is a selective mode of attention: it circumscribes and cherishes some things, lives, or phenomena as its objects. In the process, it excludes others’ (Martin et al., 2015, p. 627). Being action-oriented, and always contested, care is a practice which involves work as well as responsibility (Puig de la Bellacasa, 2011). Attentive to these politics, the lens of care perhaps has the potential to bolster and extend commitments to ‘responsible AI’, ‘ethics in AI’ and ‘algorithmic accountability’ in education (Ada Lovelace Institute, 2022). The idea of care requires a commitment to ongoing attentiveness and thoughtfulness. Care for, and care in, education cannot be achieved through computational detachment; it requires sociodigital systems to actively work against neglect, mistreatment or disregard, particularly where these have been systematised. In this sense, care offers an orientation to justice that is arguably otherwise missing from dominant EdTech imaginaries.

Irina Zakharova and Juliane Jarke (2022) draw on a feminist ethics of care to explore how care can work with, through, and against EdTech. They examine how different modes of care, formed through specific relations of power, are also shaped through and brought into relation with different data practices and technological systems in educational settings. This allows them to trace how educational technologies can be antagonistic to care (for example, being in conflict with other school actors’ modes of care); intermediaries to care (a conduit for school actors’ care practices); and recipients of care, or a means to receive care in schools (in the sense that technology is co-constitutive of care practices, rather than separate to them) (Zakharova & Jarke, 2022). Reflecting on this literature, care-full (rather than care-less) educational technologies would be alive to the ways sociodigital practices shape the very form and purpose of education. Care-full EdTech is responsible for its educational actions, not least by refusing neglectful, discriminatory, or thoughtless design.

Democratisation

The twinned forces of enclosure and dispossession inherent to capitalist systems are, unsurprisingly, constitutive features of the expansion of the EdTech industry. These forces are both material and virtual. Digital enclosure, upon which the value extraction and wealth accumulation of the industry turns, occurs through the denial of ownership of and access to data – as identified, for example, by the Indigenous Data Sovereignty movement discussed above. The material forces of dispossession are not always visible to ‘end users’ of EdTech, but it is important to understand how datafication depends on extractive labour practices, often gendered, racialised and located in the global south (McQuillan, 2022; Perrigo, 2023; Williams et al., 2022), as well as the ways the industry’s processing and storage of data requires extracting from land and natural resources, with explosive social and environmental costs (Selwyn, 2022; UNESCO, 2023). In addition, there is arguably another form of enclosure/dispossession occurring through the expansion of educational technology – that of owning and controlling the idea of the future of education itself (Williamson & Komljenovic, 2022). If futures of education are to be democratically made, rather than delegated by capitalist EdTech, then what kinds of practices (sociodigital or otherwise) might we envisage and invest in?

The idea of ‘technical democracy’ has received some interest in recent years in academic analyses of EdTech, influenced by the Science and Technology Studies scholars Michel Callon and colleagues (Callon et al., 2009). Take, for example, the way in which sociodigital controversies, such as current debates around AI, are events around which publics engage in debate and dissensus. A technical democracy is achieved when techno-scientific knowledge and decision-making are deliberated upon by ordinary citizens, non-experts, powerholders and experts alike, rather than being kept predominantly in the domain of specialists and elites. ‘Technical democracy is a forum for introducing uncertainty into what appears closed or settled’ (Thompson et al., 2022, p. 4).

Indeed, there is a strong thrust in this idea to create the conditions for democratisation and participatory justice in sociodigital practices (see also Perrotta, 2022). Actively creating such spaces (what those working in this tradition call ‘hybrid forums’) with teachers, students, policy makers, industries and other actors enmeshed in EdTech – including those whose material exploitation/dispossession is at stake – is an investment in the idea that sociodigital futures of education are always in the making: they are open and contestable rather than inevitable and closed. Such efforts to theorise and create forums for more democratic experimentation with sociodigital futures of education do not assume that consensus or neat solutions are the goals of dialogue among different constituents. Instead, such forums would seek to ‘unsettle’ the workings of EdTech to turn ‘complex sociotechnical contestations into matters of public concern’ (Perrotta, 2022, p. 196).

It is of course naïve and potentially harmful to assume that ‘dialogue’ and ‘participation’ in the name of democratisation are panaceas for the injustices of EdTech, overlooking how modes of exchange and deliberation are themselves saturated by power inequities.

This has led some scholars to underline the importance of dissensus, plurality and agonism as constitutive features of democratising sociodigital futures of education. A ‘hybrid forum’ with a more thoroughgoing goal of interrogating the ethics and purposes of educational technologies would see ‘conflict and disagreement as generative in the pursuit of creating better systems and tools of practice’ (Holloway et al., 2022, p. 262). This is what Holloway and colleagues call ‘technical agonism’, drawing on Mouffe’s (1999) theory of agonistic pluralism, which understands the rejection of dissent as undemocratic. In this sense, technical agonism constitutes the conditions for the necessary ongoing contestation over sociodigital futures of education: ‘the hybrid forum does not contain the conflict, nor does it resolve it. Rather, it seeks to open new lines of inquiry that are continually subject to scrutiny and challenge’ (Holloway et al., 2022, p. 262).

Agonistic politics can play another kind of role in the democratisation of sociodigital practices. Roderic Crooks and Morgan Currie have examined the work of data activists and the ‘double-bind’ they often find themselves in. Data activism can take many forms – from resisting and evading datafication to harnessing data for political intervention and social movement organising (Veale, 2022). For the latter, Crooks and Currie observe how data activism can often lead to minoritised communities legitimising the ‘solutionism’ of data practices, perpetuating their potential injustices (such as bringing about heightened visibility or surveillance), and responsibilising communities to document harms that are already known to states and themselves. ‘In effect, community-based or participatory data activist projects produce benefits that more easily accrue to elites, experts, professionals, and data workers rather than to community members themselves’ (Crooks & Currie, 2021, p. 206).

In response, they coin the term ‘agonistic data practices’ for an approach within data activism which recognises that data carry affective and narrative valence (rather than simply ‘rationality’), can mobilise collective action, and can therefore be used to motivate political contestation rather than be oriented towards resolution. As Crooks and Currie suggest, ‘agonistic data practices do not presume that data will lead to more equitable consensus in representative government or to a more rational debate in the public sphere; instead, agonistic data practices mobilise the antagonisms that motivate people to act, to imagine alternative political arrangements, and to contribute to long-term collective action’ (Crooks & Currie, 2021, p. 201). This might, for example, involve communities generating different accounts of a sociodigital concern or intentionally constructing alternative narratives which stimulate debate. Using agonistic data practices to surface the antagonisms of EdTech can be generative for motivating different kinds of sociodigital futures in education.
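
A small, hypothetical sketch (with invented data) suggests what such an alternative account might look like in practice: the same records re-aggregated so that the unit of concern shifts from the individual student to the conditions imposed on schools:

```python
# A hypothetical sketch of an 'agonistic' re-description: the same
# attendance records re-aggregated to contest the dominant framing.
# All data are invented for illustration.

records = [
    {"school": "X", "student": 1, "absences": 12, "bus_route_cut": True},
    {"school": "X", "student": 2, "absences": 3,  "bus_route_cut": False},
    {"school": "Y", "student": 3, "absences": 9,  "bus_route_cut": True},
]

# Dominant account: rank individual students for 'corrective action'.
worst_first = sorted(records, key=lambda r: r["absences"], reverse=True)
print([r["student"] for r in worst_first])  # [1, 3, 2]

# Counter-account: the same data narrated as an infrastructure problem,
# intended to provoke public contestation rather than individual nudges.
cut_related = sum(r["absences"] for r in records if r["bus_route_cut"])
print(f"{cut_related} of {sum(r['absences'] for r in records)} absences "
      "occurred where bus routes were cut")
```

Neither aggregation is more ‘raw’ than the other; the counter-account simply mobilises the data’s narrative valence towards a different political arrangement.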

Making alternative futures in practice

Futures of education are not inevitable, as many technology-determinist articulations suggest, but are always ‘in the making’ through various interventions and practices. This making, however, requires intentional work. As Keri Facer and Neil Selwyn (2021) argue, despite several decades of optimistic discourse about the transformative potential of EdTech, it has generally failed to adequately address or ameliorate long-standing patterns of educational inequality in terms of opportunities and outcomes. Instead, creating more sustainable and just educational futures requires policy, technology industry and education actors to ‘look beyond the charismatic allure of the “techno-fix”’ and to work towards forms of technology design and use ‘that can support and sustain the longstanding and hard work of addressing the social and material obstacles to educational and social equalities’ (Facer & Selwyn, 2021, p. 1). Some of this hard work has begun to materialise in a range of locations, showing how the generative concepts described above are being deployed and tested as resources for making new forms of sociodigital futures in education.

One example, based on an explicit digital sovereignty model, is the Democratic Digitalisation Program for Education in the Catalan region of Spain. Led by the group Xnet in partnership with Barcelona City Council, its Proposal for a Democratic and Sovereign Digitalisation of Europe, published by the Publications Office of the European Union, outlined an approach to the ‘design of pedagogies and Digital School Strategies from an innovative, agile, humanistic and respectful perspective with human rights’.Footnote4 The programme is informed by critical debates and issues concerning ‘information management, data sovereignty and privacy, algorithms and discriminatory biases, Big Tech oligopolies, net neutrality, open knowledge and education, digital infrastructures, digital environmental impact’, and explicitly commits to ‘moving away from technophobia, technodeterminism or instrumental reductionism’. Its outputs include the creation of a package of open-source computer applications for use by schools in Barcelona and further afield in Europe.Footnote5 Another output is a draft ‘declaration’ consisting of commitments to human rights and democracy ‘as the foundation and horizon of digitalisation in education’, which underpins the building of new digital infrastructure for schooling created using open-source applications:

digital infrastructures have emerged as the foundation of the information market and of new types of power. Everything that surrounds them – their ownership, their development, the expert knowledge of their use, the use itself, their location, etc. – must therefore be the focus and starting point of any public policy, as the digital rights of the educational community and democratic digital education come into play.Footnote6

It has built in clear commitments to maintaining students’ ‘digital rights’ through democratic participation from the outset. Its arrangement of open infrastructure represents in prototype form what Davis et al. (2021) might describe as a critical algorithmic reform that foregrounds proactive reparation through design rather than reactive regulatory safeguards and frameworks. It is an important attempt to challenge dominant EdTech imaginaries and infrastructures. Rather than treating sociodigital futures of education as determined by technical arrangements, it emphasises alternative educational futures in the making, informed by issues of sovereignty and democratisation, and by critical algorithmic reform.

As another example, the UK-based group Data, Tech and Black Communities (DTBC) focuses on how technology can be designed with empathy and justice commitments, with an emphasis on challenging how EdTech can ‘reproduce and potentially deepen existing inequalities, especially for those affected by both digital poverty and structural racism’ (DTBC, 2021). Through community organising and public-facing initiatives, the DTBC aims to:

build community with people interested in understanding how Edtech is being used in our primary schools; help develop community research skills in both quantitative and qualitative data collection and analysis in order to better understand what is going on; shed light on what Edtech is being used in our schools, which companies are selling them and understand what they are doing with our children’s data; understand the extent and impact of digital exclusion on children living in economically deprived areas.Footnote7

As such, the DTBC prioritises democratic community participation in researching EdTech within contexts of marginalisation and fostering ways to collectively organise against EdTech’s effects as an engine for the reproduction of exclusion, inequality and injustice. Such work, arguably, is infused with an ethics of care, directing attention to the ‘situated vulnerability’ of particular children and systematically working against neglect, mistreatment or disregard of these young people.

A further example is the US-based Civics of Tech Project, established by critical educational technology academics, which provides guidance, resources and curriculum materials ‘to empower students and educators to critically inquire into the effects of technologies on their individual and collective lives’.Footnote8 A research paper by the organisers argues that EdTech’s entanglements with Western educational psychology and Big Tech constrain the imagination of the field, and offers three alternative frames – collective, critical, and ecological – to re-envision educational technology research and the possibilities of educational technology in learning and society (Heath et al., 2023). This work is informed by feminist and Indigenous traditions, anticolonial approaches, critical race theory, and ecological perspectives on the environmental impacts of technology, offering a range of research approaches, such as storytelling and place-based methodologies, that might produce new understandings and thus contribute to different ways of imagining and making EdTech.

Concluding thoughts: reorienting research for alternative sociodigital futures

We opened this paper by arguing that there are growing concerns about the future trajectories promised by the EdTech industry: trajectories that risk futures of intensified inequalities, thoughtlessness, carelessness in relationships, enclosure and extraction. Current responses to these developments are inadequate to challenge these trajectories. Neither identifying the problems nor seeking to regulate them in retrospect constitutes the work needed to create just, democratic and caring sociodigital practices in education.

Instead, we suggest that a set of key concepts and practices is emerging as a resource to help imagine and build alternatives. Without overclaiming or engaging in our own form of techno-optimism, we want to argue that even as the EdTech industry seeks to reframe education as a site of data extraction, other trajectories are possible. This claim makes a set of demands of researchers in this field.

First, the conceptual work being conducted by critical data studies, Indigenous social movements and feminist theorists and activists draws our attention to four key concepts: algorithmic reparations, data sovereignty, care and democratisation. These concepts invite the following questions:

  • How can sociodigital practices enable reparations, and what forms can they take?

  • How can principles of Indigenous Data Sovereignty be respected in their own terms, and become foundational to frameworks for data justice more broadly?

  • What sociodigital practices can be developed that centre care for, and care in, education?

  • If futures of education are to be democratically made, rather than delegated by capitalist EdTech, what kinds of sociodigital practices should we envisage and invest in?

Second, in exploring these questions, we argue that we do not need to start from scratch: proceeding from the position of abundance (Gibson-Graham, 2008), we can already see practical examples with which we can create common cause. These projects, as well as the key concepts of reparation, sovereignty, care and agonistic democracy, however, call for an active, engaged orientation to research, one that is informed by and learns through practical experimentation about the sorts of futures that might be possible. Reparation, for example, demands the intentional effort to address and care for historic and present inequalities; care demands ongoing attention to situated vulnerability; agonistic democracy demands the surfacing, through data activism, of alternative accounts of the world; data sovereignty demands an active remaking of governance practices and regulation. There is no fixed set of outcomes stemming from this work, but we suggest the uncertainty – and the multiple futures it opens up for EdTech – is productive and necessary.

This demand for active and engaged intervention is manifest in the examples we have discussed. The Civics of Tech approach, for example, is grounded in collective, critical and ecological approaches and methodologies, and offers a model for reimagining EdTech research. The DTBC’s efforts exemplify the practice of coproduction and organising with community groups as a means of investigating the lived experiences of EdTech and the exclusions it entails. Examples of engaged research such as this both illuminate the tensions between the anticipated futures that are engineered into EdTech products and the actual experiences of students, and point towards alternative designs for other sociodigital futures of education. They also point towards the importance of working with communities, as partners in sociodigital practices in education, to generate the public matters of concern that provoke the controversies necessary for the emergence of technical democracies. Collaborative research in this field, therefore, may comprise not only the design-based research familiar from concepts of ‘design justice’ (Macgilchrist et al., 2023), but also begin to generate practices of public controversy around current EdTech trajectories. Our challenge may be precisely to begin to unsettle and render strange the assumptions that education constitutes a site for data extraction and speculation, and a site of thoughtless, careless enclosure of public goods. The forms that such interventions may take are, as yet, unclear, but we may want to learn from wider successful social movements for democracy and care in their appropriation of interventionist and participatory arts methods and tools for public engagement (e.g. Fremeaux & Jordan, 2021).

Finally, both the key concepts that we have identified and the practices that we see arising in response to the EdTech industry invite us to rethink another aspect of engaged sociodigital research. They show us that our imperative may not be ‘merely’ to design new systems, but also to find ways to dismantle old systems in which extraction, harm and inequalities are embedded. Our ethical imperative, therefore, is not only to invent alternatives but also to ‘hospice’ (De Oliveira, 2021) existing harmful sociodigital systems and practices, so that something new can emerge.

Our argument in this paper, therefore, is that the current situation requires more than critique and regulation. It demands instead active, engaged research in partnership with the communities currently facing harm from dominant EdTech practices, a partnership that will attend not only to opening up and creating alternatives but to destabilising and hospicing existing sociodigital practices of injustice.

Acknowledgements

The support of the Economic and Social Research Council (ESRC) is gratefully acknowledged (Grant Ref: ES/W002639/1).

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the Economic and Social Research Council.

Notes on contributors

Arathi Sriprakash

Arathi Sriprakash is Professor of Sociology and Education at the University of Oxford. She is Principal Investigator of the Reparative Futures of Education project (www.repair-ed.uk) and co-investigator of the ESRC Centre for Sociodigital Futures.

Ben Williamson

Ben Williamson is a senior lecturer at the Centre for Research in Digital Education, University of Edinburgh and a co-editor of Learning, Media and Technology.

Keri Facer

Keri Facer is Professor of Educational and Social Futures at Bristol University, Professor of Public Education at Black Mountains College and Visiting Professor at the Swedish University of Agricultural Sciences (SLU), Uppsala. She is an interdisciplinary researcher with particular interest in the temporal imagination and the role of the future in structuring perceptions of possibility, particularly in sites of education and informal learning. She leads the British Academy Programme ‘Times of a Just Transition’, is Co-Investigator at the ESRC Centre for Sociodigital Futures, and is working with the Joseph Rowntree Foundation and Black Mountains College on the ecological imagination.

Jessica Pykett

Jessica Pykett is Professor of Social and Political Geography and Co-Director of the Centre for Urban Wellbeing at the University of Birmingham, UK. She is the author of Brain Culture: Shaping Policy through Neuroscience. Her research has looked at how neuroscientific and behavioural research is influencing policy and governance – from public health, architecture and urban design to economic theory, education, workplace training and wellbeing. She is currently Principal Investigator on the Ethics and Expertise Beyond Times of Crisis project and Co-Investigator at the Centre for Sociodigital Futures, both funded by UK Research and Innovation.

Carolina Valladares Celis

Carolina Valladares Celis’ research critically explores the integration of digital technologies in education. Her work is concerned with making forms of dominance visible and exploring structural inequalities affecting educational processes. Over the past decade, she has carried out research with various stakeholders across all levels of formal education and has engaged in academic work involving public forms of pedagogy in Latin America and the UK. Her current work explores how the future of education is imagined and acted upon in the present, as well as who participates (or not) in designing those imaginaries. Ultimately, her research is interested in surfacing alternatives for fairer and more sustainable futures.

Notes

References

  • Ada Lovelace Institute. (2022). Algorithmic accountability for the public sector. https://www.adalovelaceinstitute.org/report/algorithmic-accountability-public-sector/
  • Agrawal, H. (2022). How can technology mitigate inequity in education systems? https://elearningindustry.com/how-can-technology-mitigate-inequity-in-education-systems
  • Bonini, T., & Treré, E. (2024). Algorithms of resistance: The everyday fight against platform power. MIT Press.
  • Callon, M., Lascoumes, P., & Barthe, Y. (2009). Acting in an uncertain world: An essay on technical democracy. MIT Press.
  • Crooks, R. (2021). Productive myopia: Racialized organizations and EdTech. Big Data & Society, 8(2). https://doi.org/10.1177/20539517211050499
  • Crooks, R., & Currie, M. (2021). Numbers will not save us: Agonistic data practices. The Information Society, 37(4), 201–213. https://doi.org/10.1080/01972243.2021.1920081
  • Davies, H., Eynon, R., Komljenovic, J., & Williamson, B. (2022). Investigating the financial power brokers behind EdTech. In S. Livingstone & K. Pothong (Eds.), Education data futures: Critical, regulatory and practical reflections. Digital Futures Commission, 5Rights Foundation. https://educationdatafutures.digitalfuturescommission.org.uk/essays/competing-interests-in-education-data/investigation-financial-power-brokers-edtech
  • Davis, J. L., Williams, A., & Yang, M. W. (2021). Algorithmic reparation. Big Data & Society, 8(2). https://doi.org/10.1177/20539517211044808
  • De Oliveira, V. M. (2021). Hospicing modernity: Facing humanity’s wrongs and the implications for social activism. North Atlantic Books.
  • DTBC – Data, Tech and Black Communities. (2021). Is EdTech really improving outcomes for marginalised children? https://medium.com/data-tech-black-communities/is-edtech-really-improving-outcomes-for-marginalised-children-53fbbd5c9c2
  • Duncan, G. (2005). Critical race ethnography in education: Narrative, inequality and the problem of epistemology. Race, Ethnicity and Education, 8(1), 93–114. https://doi.org/10.1080/1361332052000341015
  • Escobar, A. (2018). Designs for the pluriverse: Radical interdependence, autonomy, and the making of worlds. Duke University Press.
  • Facer, K., & Selwyn, N. (2021). Digital technology and the futures of education – towards ‘non-stupid’ optimism. Paper commissioned for the UNESCO Futures of Education report. https://unesdoc.unesco.org/ark:/48223/pf0000377071
  • Fremeaux, I., & Jordan, J. (2021). We are ‘nature’ defending itself: Entangling art, activism and autonomous zones. Pluto Press.
  • Gibson-Graham, J. K. (2008). Diverse economies: Performative practices for ‘other worlds’. Progress in Human Geography, 32(5), 1–20. https://doi.org/10.1177/0309132508090821
  • GIDA – Global Indigenous Data Alliance. (2022, December 12). https://www.gida-global.org/
  • Gulson, K. N., Sellar, S., & Webb, P. T. (2022). Algorithms of education: How datafication and artificial intelligence shape policy. University of Minnesota Press.
  • Gulson, K. N., & Witzenberger, K. (2022). Repackaging authority: Artificial intelligence, automated governance and education trade shows. Journal of Education Policy, 37(1), 145–160. https://doi.org/10.1080/02680939.2020.1785552
  • Hakimi, L., Eynon, R., & Murphy, V. (2021). The ethics of using digital trace data in education: A thematic review of the research landscape. Review of Educational Research, 91(5), 671–717. https://doi.org/10.3102/00346543211020116
  • Heath, M. K., Gleason, B., Mehta, R., & Hall, T. (2023). More than knowing: Toward collective, critical, and ecological approaches in educational technology research. Educational Technology Research & Development, 1–23. https://doi.org/10.1007/s11423-023-10242-z
  • Holloway, J., Lewis, S., & Langman, S. (2022). Technical agonism: Embracing democratic dissensus in the datafication of education. Learning, Media and Technology, 48(2), 253–265. https://doi.org/10.1080/17439884.2022.2160987
  • Knox, J. (2022). (Re)politicising data-driven education: From ethical principles to radical participation. Learning, Media and Technology.
  • Krutka, D. G., Smits, R. M., & Willhelm, T. A. (2021). Don’t be evil: Should we use Google in schools? TechTrends, 65(4), 421–431. https://doi.org/10.1007/s11528-021-00599-4
  • Ladson-Billings, G. (2006). From the achievement gap to the education debt: Understanding achievement in US schools. Educational Researcher, 35(7), 3–12. https://doi.org/10.3102/0013189X035007003
  • Macgilchrist, F. (2019). Cruel optimism in edtech: When the digital data practices of educational technology providers inadvertently hinder educational equity. Learning, Media and Technology, 44(1), 77–86. https://doi.org/10.1080/17439884.2018.1556217
  • Macgilchrist, F., Allert, H., Cerratto Pargman, T., & Jarke, J. (2023). Designing postdigital futures: Which designs? Whose futures? Postdigital Science & Education, 6(1), 13–24. https://doi.org/10.1007/s42438-022-00389-y
  • Martin, A., Myers, N., & Viseu, A. (2015). The politics of care in technoscience. Social Studies of Science, 45(5), 625–641. https://doi.org/10.1177/0306312715602073
  • McQuillan, D. (2022). Resisting AI: An anti-fascist approach to artificial intelligence. Bristol University Press.
  • McQuillan, D. (2023, June). Predicted benefits, proven harms: How AI’s algorithmic violence emerged from our own social matrix. The Sociological Review Magazine, 6. https://doi.org/10.51428/tsr.ekpj9730
  • Moreton-Robinson, A. (2015). The white possessive: Property, power, and indigenous sovereignty. University of Minnesota Press.
  • Mouffe, C. (1999). Deliberative democracy or agonistic pluralism? Social Research, 66(3), 745–758.
  • OECD. (2021). OECD digital education outlook 2021. Pushing the frontiers with artificial intelligence, Blockchain and robots. https://doi.org/10.1787/589b283f-en
  • Oremus, W. (2020, June 26). 5 ideas to make Silicon Valley less racist. OneZero. https://onezero.medium.com/five-ideas-to-make-silicon-valley-less-racist-7a1069ad05f8
  • Perrigo, B. (2023, January 18). OpenAI used Kenyan workers on less than $2 per hour to make ChatGPT less toxic. Time. https://time.com/6247678/openai-chatgpt-kenya-workers/
  • Perrotta, C. (2022). Advancing data justice in education: Some suggestions towards a deontological framework. Learning, Media and Technology. https://doi.org/10.1080/17439884.2022.2156536
  • Prehn, J. (2022, June). Indigenous data sovereignty and education. In The digitalisation of education (NORRAG Policy Insights #01, pp. 38–39). https://resources.norrag.org/resource/719/the-digitalisation-of-education (accessed 30 December 2022).
  • Puig de la Bellacasa, M. (2011). Matters of care in technoscience: Assembling neglected things. Social Studies of Science, 41(1), 85–106. https://doi.org/10.1177/0306312710380301
  • Rudolph, S. (2019). Unsettling the gap: Race, politics and indigenous education. Peter Lang.
  • Selwyn, N. (2022). Less work for teacher? The ironies of automated decision-making in schools. In S. Pink, M. Berg, D. Lupton, & M. Ruckenstein (Eds.), Everyday automation: Experiencing and anticipating emerging technologies. Taylor & Francis.
  • Selwyn, N. (2023). Digital degrowth: Toward radically sustainable education technology. Learning, Media and Technology. https://doi.org/10.1080/17439884.2022.2159978
  • Selwyn, N., Henderson, M., & Chao, S.-H. (2015). Exploring the role of digital data in contemporary schools and schooling – ‘200,000 lines in an Excel spreadsheet’. British Educational Research Journal, 41(5), 767–781. https://doi.org/10.1002/berj.3186
  • Selwyn, N., Hillman, T., Bergviken-Rensfeldt, A., & Perrotta, C. (2023). Making sense of the digital automation of education. Postdigital Science and Education, 5(1), 1–14. https://doi.org/10.1007/s42438-022-00362-9
  • Smith, L. T. (2021). Decolonizing methodologies: Research and indigenous peoples. Bloomsbury Publishing.
  • Sriprakash, A. (2022). Reparations: Theorising just futures of education. Discourse: Studies in the Cultural Politics of Education, 44(5), 782–795. https://doi.org/10.1080/01596306.2022.2144141
  • Swist, T., & Gulson, K. N. (2023). Instituting socio-technical education futures: Encounters with/through technical democracy, data justice, and imaginaries. Learning, Media and Technology, 48(2), 181–186. https://doi.org/10.1080/17439884.2023.2205225
  • Táíwò, O. O. (2022). Reconsidering reparations. Oxford University Press.
  • Thompson, G., Gulson, K., Swist, T., & Witzenberger, K. (2022). Responding to sociotechnical controversies in education: A modest proposal toward technical democracy. Learning, Media and Technology, 48(2), 240–252. https://doi.org/10.1080/17439884.2022.2126495
  • UNESCO. (2023). An ed-tech tragedy? Educational technologies and school closures in the time of COVID-19. https://unesdoc.unesco.org/ark:/48223/pf0000386701
  • Valle, F. S. (2023). In defense of solidarity and pleasure: Feminist technopolitics from the global south. Stanford University Press.
  • Veale, M. (2022). Schools must resist big EdTech – but it won’t be easy. In S. Livingstone & K. Pothong (Eds.), Education data futures: Critical, regulatory and practical reflections. Digital Futures Commission, 5Rights Foundation. https://educationdatafutures.digitalfuturescommission.org.uk/essays/competing-interests-in-education-data/schools-must-resist-big-edtech
  • Walter, M., Lovett, R., Maher, B., Williamson, B., Prehn, J., Bodkin-Andrews, G., & Lee, V. (2021). Indigenous data sovereignty in the era of big data and open data. The Australian Journal of Social Issues, 56(2), 143–156. https://doi.org/10.1002/ajs4.141
  • Whitman, M. (2020). “We called that a behavior”: The making of institutional data. Big Data & Society, 7(1). https://doi.org/10.1177/2053951720932200
  • Williams, A., Miceli, M., & Gebru, T. (2022, October 13). The exploited labor behind artificial intelligence. Noema. https://www.noemamag.com/the-exploited-labor-behind-artificial-intelligence/
  • Williamson, B. (2019). Datafication of education: A critical approach to emerging analytics technologies and practices. In H. Beetham & R. Sharpe (Eds.), Rethinking pedagogy for a digital age: Principles and practices of design. Routledge.
  • Williamson, B., Gulson, K. N., Perrotta, C., & Witzenberger, K. (2022). Amazon and the new global connective architectures of educational governance. Harvard Educational Review, 92(2), 231–256. https://doi.org/10.17763/1943-5045-92.2.231
  • Williamson, B., & Komljenovic, J. (2022). Investing in imagined digital futures: The techno-financial ‘futuring’ of edtech investors in higher education. Critical Studies in Education, 64(3), 234–249. https://doi.org/10.1080/17508487.2022.2081587
  • World Bank. (2020). Reimagining human connections: Technology and innovation in education at the World Bank.
  • Yu, J., & Couldry, N. (2022). Education as a domain of natural data extraction: Analysing corporate discourse about educational tracking. Information, Communication & Society, 25(1), 127–144. https://doi.org/10.1080/1369118X.2020.1764604
  • Zakharova, I., & Jarke, J. (2022). Educational technologies as matters of care. Learning, Media and Technology, 47(1), 95–108. https://doi.org/10.1080/17439884.2021.2018605