
Contexts and dimensions of algorithm literacies: Parents’ algorithm literacies amidst the datafication of parenthood

ABSTRACT

In this paper, I present contextualizing factors, dimensions, and key markers of algorithm literacies, paying attention to the context of parenting and parenthood amidst datafication. Analyzing data from “think-aloud” interviews with 30 parents of children aged between 0 and 18, across England, I draw upon media and digital literacies scholarship to focus first on the competencies, conversations, and events which contextualize parents’ literacies with algorithmic interfaces. Next, I draw out four dimensions of parents’ algorithm literacies, including algorithm awareness, technical competencies, critical capacities, and championing their and their children’s best interests, identifying practical markers for each dimension. I reflect on the broader implications of these for parenting and parenthood in datafied societies, and note that algorithm literacies are, forever, a work in progress, in fluidity and flux across the diverse courses of parenting journeys, deeply contextualized in the resources and restraints that parents encounter in their daily lives.

Liam is a secondary school teacher in the south of England, and the father of a toddler and an infant. As he searches for help with his toddler’s vomiting and sickness, he tells me that, if something he needs, such as a spot of parenting advice or information, does not come up on the first few pages of a Google search, it is probably because “the algorithm doesn’t like it.” When I ask Liam what he means by “the algorithm” and how the algorithm might shape what he sees on toddlers vomiting depending on whether it “liked” the content or not, Liam points out to me that he often prefers to use Duck Duck Go rather than Google, so that what he searches for can stay untracked. He then explains to me, with a great deal of confidence, where search data goes, and how the journeys of search data might shape the stories, adverts and posts he sees on a wide range of other platforms. Whilst Liam’s technical expertise in turning tracking on or off, comparing Google and Duck Duck Go (see here Parsania, Kalyani, & Kamani, Citation2016), or indeed his critical competencies in figuring out why his data journeys matter (Buckingham, Citation2007; Pangrazio & Sefton-Green, Citation2021), are of great interest to me, I probe him further on when he started thinking about such matters, and when and how he might have figured out that “the algorithm doesn’t like” some things.

I’m quite left wing but at the same time I’m aware of a spectrum as well, but I think that … I’ve been switched on … me personally to technology in general since a very young age. Being a millennial, like, I’ve grown up … the Internet’s grown up with me. I had a Windows 95 growing up … and then as soon as I heard about the algorithm and it tracks your usage by blah blah blah. I think as soon as I heard about that, which was probably about 7 years ago … it might have been a friend but yeah it felt like something anecdotal that yeah like oh, by the way, Google does this so turn it off.

Liam, father of a 2 year old and an infant, south of England

Liam, and indeed other parents I spoke to, often explained their technical tactics with algorithms and personal data as having come from somewhere – a conversation with friends, a shared YouTube link from a cousin, or sometimes, pondering about what an “overly cynical” ex-partner or current partner says about such things. These amalgamations of small maneuvers, semi-certain notions about algorithms, and broader agendas of what parents do or do not ever do in terms of their children’s digital data all have histories, as Liam notes, and represent what Livingstone and Blum-Ross (Citation2020) describe as rooted in parents’ own life stories and pasts, as much as in their musings about today and tomorrow.

Locating algorithm literacies within the media and digital literacies conversation

In this paper, I draw upon a broader project – Parents talking algorithms – to consider parents’ algorithm literacies as part of their broader digital practices in an algorithm age. I locate this work within the context of a vibrant arena of research on algorithm literacies (see the excellent overview and agenda presented by Oeldorf-Hirsch & Neubaum, Citation2023), including empirical work on users’ experiences of algorithms (Bishop, Citation2019; Ytre-Arne & Moe, Citation2021; Siles, Citation2023). Here, I recognize algorithm literacies as part and parcel of a broader concept pool (Markham, Citation2020), including data and platform literacies (DeVito, Citation2021), necessitated amidst datafication. I approach algorithm literacies not separately from, but as a nested (see note 1) component within, the broader media and digital literacies conversation (Buckingham, Banaji, Carr, Cranmer, & Willett, Citation2005; Livingstone, Citation2004), with its longstanding emphasis on understanding, awareness, technical skills and critical capacities. Indeed, it is critical to bear in mind that my attention to parents’ algorithm literacies is not intended as a replacement for, or a diversion from, institutional responsibilities around platforms and the power of algorithms. Media and digital literacies research has linked literacy to the notion of legibility (Das, Citation2011; Livingstone, Citation2008), where illegible interfaces, and by extension, harmful and damaging interfaces, must not escape attention amidst the attention to literacies. As Pangrazio and Selwyn (Citation2019) note, the focus on data, digital or algorithm literacies is only likely to be partially effective unless “seen as one element amongst others” (p. 209). It is equally important to note here that any attention to algorithm literacy – best treated perhaps in its plural form, literacies, given the diversity of practices – is not positioned as a replacement or alternative for other digital literacies.
Data literacies, AI literacies, and algorithm literacies, for instance, share foundational critical and normative alignments with media and digital literacies. Despite these broader and more longstanding roots within media and digital literacies, datafication demands, DeVito argues in an exposition of algorithmic literacy (2021), a set of new(er) literacies, where algorithm literacies are components of a new set of platform literacies, keeping up with the rapidity of technological development. Particularly for parenthood, I see the scope of algorithm literacies as incorporating: 1) parents’ awareness (Gran, Booth, & Bucher, Citation2021) of the presence of algorithms, 2) parents’ technical competencies, 3) parents’ critical capacities (Snyder & Beavis, Citation2004) to make sense of what algorithms represent, and 4) parents’ abilities to champion their children’s best interests, including mentoring, brokering and shaping (c.f. Livingstone & Blum-Ross, Citation2020) the role of algorithms in their children’s lives. Whilst the last dimension might at first sight appear to be outside of the scope of any definition of algorithm literacies, it incorporates an element beyond advocating and engaging for oneself in relation to media and technologies, and extends to one’s abilities to do so for others. Indeed, this aspect relates closely to the civic dimensions of many expositions of critical media and digital literacies (c.f. Buckingham, Citation2007; Polizzi, Citation2021). As Livingstone and Blum-Ross astutely argue in their work on parenting for digital futures, parents have significant roles in acting as technology mentors for their children (2020). This role matters to parents, and advocating for their children (Alper, Citation2023) is important to them.
Every parent in my dataset of 30 instinctively introduced me to their myriad, diverse practices around algorithms in relation to their children when I attempted to find out about their own approaches to algorithms. Whilst many parents drew distinctions between algorithms and data for themselves versus their children, and outlined often differing stances on these, they took their roles as mentors and shapers seriously. Often, though, their competencies, abilities and practices in relation to these roles were uneven.

The literature on media and digital literacies (Buckingham, Citation2007; Livingstone, Citation2010; Mihailidis, Citation2018), and then data and algorithm literacies (Fotopoulou, Citation2021; Pangrazio & Selwyn, Citation2019), is clear in its recognition that literacies are far more than button-pushing technical skills, although technical skills and tactics do indeed matter and are meaningful in various ways. Decades of scholarship on media and digital literacies have demonstrated that literacies have histories and contexts, and, like all practices, are shaped by a diverse set of often unequal resources. Bhargava (2015) underlines that people’s literacies with data and algorithms go beyond technical skills to broader questions around abilities to critique and to participate for civic, aesthetic and emancipatory purposes. This sits in alignment with the aspirations behind media and digital literacies, where new literacies scholarship (Luke, Citation2013; Green, Citation1999; Snyder and Beavis, Citation2001) has consistently argued for attention to critical capacities in relation to texts and technologies. Writing in 2003 about the shift in modalities between page and screen, Gunther Kress draws this distinction between technical tactic and critique by drawing out attention also to time for reflection. Kress notes – “certainly, the skills of near instant response are essential, though I am not clear whether there is ever time for reflection, for assessment, for the quiet moment of consideration and review” (p. 174). Both matter, and one shapes the other, as conversations with the parents revealed to me time and again, as we spoke about their diverse understandings and practices with algorithms.
As I pay attention to parents’ algorithm literacies, I draw from media and digital literacies scholarship, then, to stay away in this paper from any attempts to foreground or privilege technical skills over critical practices (Snyder & Beavis, Citation2004; Buckingham, Citation2006; 2010). As Livingstone and Blum-Ross remind us (Livingstone, Stoilova, & Nandagiri, Citation2020), parents’ own interests and expertise, their life histories and memories, and the interests of their children have a role to play in the ways in which parents approach technology. This means not conceptualizing parents as individuals alone, operating in isolated bubbles, but thinking relationally, of parents in relation to others in their household, in relation to their contexts, in relation to other parents, and of course, in relation to institutions. This qualitative attention to parents’ histories, memories, and experiences is, as McCosker (Citation2017) suggests, an important “way of finding the fault lines, misuses, the conflict and contradictions, moments of resistance or reconfigurations”.

Attention to parents’ digital literacies has often had to do with parents’ roles in sharing content about their children online (Siibak & Traks, Citation2019; Livingstone & Blum-Ross, Citation2017). There is concern in this area from scholars around parents’ digital storytelling behaviors, as Barnes and Potter (Citation2021) note. Arguably, from this perspective, parents’ algorithm literacies matter for children. But I suggest that they matter even beyond the context of sharenting alone, because they shape the visions of parenthood parents see when they engage with an algorithmically curated timeline or newsfeed, or when they stumble upon news or content relating to perceived and real risks repeatedly on curated news content on aggregator sites, and the ways in which they are able to champion their own and their children’s best interests. Algorithm literacies matter when parents respond to the myriad invitations which come their way through recommendations, and when they go online in quests for information, advice and support for anything from a late night fever to long-standing emotional needs or help with nouns and adverbs for a spot of homework. Indeed, as Gran and colleagues (2021) argue, algorithm awareness potentially corresponds to a new digital divide. Their research in the highly digitized context of Norway shows variation and difference in citizens’ algorithm awareness. Their algorithm awareness typology of the unaware, the uncertain, the formative, the neutral, the skeptic, and the critical invites us, in the context of parenthood, to consider what the implications of such wide variations in algorithm awareness and algorithm literacy are for parenthood in contemporary digital and datafied societies, and for children, parents, and families.
Livingstone and colleagues, in thinking about privacy literacy (2021), draw attention to a distinction between people’s literacies about interpersonal privacy, in terms of how personal data is created, accessed, and made sense of in interpersonal connections; institutional privacy, in terms of how public institutions and bodies handle data; and commercial privacy, in terms of the harvesting of data by commercial enterprises. This is a key distinction to bear in mind when thinking about algorithm literacies, because all three dimensions are significant in the context of parenthood. This project, for instance, has considered parents’ musings about algorithms in the context of timelines, where the interpersonal is key to shaping the visions of others’ parenthood which cluster on parents’ timelines and which shape how parents feel, as many explained to me in our interviews. But equally, parents have spoken about recommendations, and their responses to recommendations as invitations in an ever digital parenthood, which has shed particular light on the commercial aspects of parents as consumers in relation to their children (c.f. Le‐Phuong Nguyen, Harman, & Cappellini, Citation2017), and parents have also spoken about the institutional, in terms of algorithms and data driven automation in the public domain (see Kaun, Citation2022). Scholarship on media and digital literacies, and indeed algorithm literacies (Cotter, Citation2020), draws attention to the talk around algorithms. Here, anecdotes about knowledge-sharing, shared links, hearsay – the cultural resources behind literacies (Caronia, Citation2009), the folk theories about algorithms (Siles, Segura-Castillo, Solís, & Sancho, Citation2020; Toff & Nielsen, Citation2018; Ytre-Arne & Moe, Citation2021) – matter.
The gossip about algorithms matters, as Sophie Bishop (Citation2019) draws out astutely in her work on algorithmic gossip, and as DeVito unpacks carefully as exogenous and endogenous sources of knowledge in people’s developing algorithm literacies (DeVito, Citation2017). Parents’ algorithm literacies connect to parents’ contexts and histories, as one of the foundational ideas in Livingstone and Blum-Ross’s work on parenting for digital futures (Livingstone, Stoilova, & Nandagiri, Citation2020) indicates. They argue that we must interpret parents’ perspectives, roles and personal philosophies as work in flux, between generations, where parents carry their own histories, contexts and the full complexity of their own past and present trajectories into their own roles as parents.

In this paper, I present contextualizing factors, dimensions and key markers of parents’ algorithm literacies in a broader context of parenting and parenthood amidst datafication. Analysing data from “think-aloud” interviews with 30 parents of children aged between 0 and 18, across England, I draw upon media and digital literacies scholarship to focus on the competencies, conversations and events which contextualize parents’ literacies with algorithmic interfaces. I draw out four dimensions of parents’ algorithm literacies, including algorithm awareness, technical competencies, critical capacities, and championing their and their children’s best interests, identifying practical markers for each dimension. I reflect on the broader implications of these for parenting and parenthood in datafied societies, and note that algorithm literacies are, forever, a work in progress, in fluidity and flux across the diverse courses of parenting journeys, deeply contextualized in the resources and restraints parents encounter in their daily lives.

Method

Around England, I conducted in-depth “think-aloud” interviews with 30 parents (see Charters, Citation2003; Leighton, Citation2017; Swart, Citation2021). Over the course of an hour-long conversation with me, parents clicked through their most used platforms and websites, to think aloud about their regular parenting activities on search engines, entertainment platforms, social media feeds, and more (see Das, Citation2023). On the one hand, I looked at the intersections of parenting and parenthood domains, which were the focus of well-established ideological debates (c.f. Lee, Bristow, Faircloth, & Macvarish, Citation2014) in the sociology of parenting, and a variety of specific algorithmic interfaces – search engines, social media timelines, recommendation systems, news aggregators and apps – the user responses to which have been central to user-centric algorithm studies. Like Gruber and Hargittai (Citation2023), my focus was never on evaluating technical or algorithmic abilities. I instead concentrated on openly exploring the role of algorithms across multiple areas of parenthood, with an emphasis on the imaginations, expectations, possibilities, and feelings forming around algorithms (Bucher, Citation2017). As algorithms are not texts in the sense of texts in audience reception studies, as Lomborg and Kapsch (Citation2020) point out in their work on decoding algorithms, I was mindful of discussions within user-centric algorithm studies about how to speak of and describe algorithms in the actual doing of fieldwork on users negotiating algorithmic interfaces. This needed to take into account the recursive relationship between algorithms and use (Gillespie, Citation2014).
This links to what Siles, Espinoza-Rojas, Naranjo, and Tristán (Citation2019) call mutual domestication (see also Rader & Gray, Citation2015; Eslami et al., Citation2015), where, as Gruber and Hargittai (Citation2023) recently demonstrate, people’s awareness of the collection and analysis of personal data shapes the ways in which people feel able to understand and engage with algorithmic interfaces. I carefully considered whether to introduce the term “algorithm” – or any other technical terms – up front, or to leave room for participants to potentially discuss algorithms in their own terms. In the end, like many others before me (Karizat, Delmonaco, Eslami, & Andalibi, Citation2021; Swart, Citation2021), I leaned heavily toward the latter and chose not to centralize the term “algorithm” itself in the interview process.

Thirty parents from across England participated in my think-aloud interviews, comprising 15 mothers and 15 fathers, with a quarter of the total number of parents being from a minority ethnic background. The parents I spoke with ranged in age from their 20s to their 50s, but they all had children who were between the ages of 0 and 18 at the time of our discussion. As a result, I got the opportunity to speak with parents at a variety of parenting phases, from those who had a newborn to those whose children were getting ready to depart for college or other opportunities. I talked to parents from the south, the north, and various places in between, including both rural and urban areas of England. They held a variety of jobs, including those in professional services, education, health care, social services, transport, marketing, and a number of other areas. Some of them were stay-at-home parents or ran small companies online. The study underwent the University of Surrey’s comprehensive University Ethics Committee ethical assessment procedure (redacted). All names have been changed to pseudonyms with the participants’ informed consent.

Table 1. Participant Profile.

Contextualising parents’ algorithm literacies

Parents brought a diversity of contexts and an uneven set of contextual resources into our conversations about algorithms. Their awareness, understanding, technical skills and critical competencies around algorithms varied by platform, and even by the particular stage in parenthood they were at, or how able they felt to act on their knowledge and awareness, and more. Often, parents had a broad, generic idea of datafication in contemporary societies, with little mechanical or technical understanding. And equally often, they had a high degree of technical understanding of one or two specific components of a particular platform, but little else in other contexts. In what follows, I draw attention first to the various personal and professional competencies parents bring to their interfaces with algorithms. These competencies are not necessarily, if at all, solely technical or to do with the mechanics of algorithmic systems, and draw more widely upon professional roles which involve learning or teaching about technologies in general, or perhaps even more broadly about being critical in evaluating what one sees in a mediated world. I then pay particular attention to knowledge sharing, gossip, and informal talk inside households, between parents, conversations with colleagues, and the various informal sources of their knowledge, where the sharing of theories about algorithms (Siles, Segura-Castillo, Solís, & Sancho, Citation2020), or gossip (Bishop, Citation2019) between fellow parents, have a shaping role on their algorithm literacies. In this context of shared knowledge, and uneven competencies brought to their interfaces with algorithms, I note also that for many parents, there was often a moment when “something happened”, so to speak.
These moments of something happening might appear fleeting, or insignificant, for they are different, it seemed, from large, disruptive or identity-changing life events. But these small digital events – when something happened, in parents’ words – changed the ways in which they approached algorithms.

Competencies from elsewhere

Mehmet is a father of two children, aged 11 and 9. He is a secondary school teacher who says he is unnerved by the algorithmic filtering and curation of his feeds, and the striking similarities between his interests and the recommendations he stumbles upon. He asserts to me in conversation that he is not an engineer, but that he “talks about these things” with his school children. Sometimes, if he feels they are “absolutely fine” with certain aspects of platform power, he reminds them that he isn’t, or perhaps that he finds these a “little bit intrusive.” Mehmet says –

I’m slightly unnerved by it, and I talk about this with my school children … I’ve been a teacher 20 years and you know, I’m aware of that. So I can challenge that in class with discussion. So that’s interesting …

Mehmet, father of an 11 year old and a 9 year old, north of England

In speaking to his own children at home about data, platforms and algorithms, then, Mehmet says he draws upon some of his work as a teacher. This relates particularly to two key dimensions of parents’ algorithm literacies I alluded to earlier – parents’ critical capacities around algorithms and their shaping power, and their roles as mentors and brokers (c.f. Livingstone & Blum-Ross, Citation2020) in shaping their and their children’s best interests in an algorithm age. Mehmet says, about his own children and how he sees to it that their searches on Alexa remain fairly “bland” –

How do they choose what they put on Alexa? It got me thinking, but it’s quite … the information that they provide, quite often they save the source of it, don’t they? So according to, I don’t know, source X or whatever - it’s quite bland as well. It’s quite short and quite bland.

Mehmet’s unpacking of the journeys of search data, including voice assistants, ties into his broader skepticism around intrusive interfaces (Mollen & Dhaenens, Citation2018), and he speaks lucidly about having these conversations with the age group he teaches at school, and indeed raises at home. His professional competencies as a teacher of secondary school children come in, despite his lack of a high degree of mechanical or technical competencies, to resource conversations with children at home and school about technology. Like Mehmet, Patrick too has been a teacher for a while, 15 years when we spoke. He is also involved in the leadership team looking after alternative provision for children with special educational needs. He has two daughters, aged 11 and 10. Patrick teaches ICT, and says he “sort of understands” how search engine algorithms work. Teaching ICT, he says, informs him that Google search algorithms can learn things about him.

Google can learn the sort of websites that you would go to. So if you’re if you’re searching a health question, for example, and you tend to go for things like the NHS website or, you know, Web MD or something like that. Those searches would tend to crop up towards the top. But if you constantly scroll through and try and find some weird obscure conspiracy theory, they’ll learn that that’s the sort of thing you’re interested in.

Unlike Mehmet’s, Patrick’s understanding of algorithms and awareness of the presence of algorithms is heightened when he demonstrates a search to me. He also draws a distinction between search topics, but expresses some concern about the potential algorithmic shaping of search results on vaccines for children. Like many parents I spoke to, Patrick is less clear about who “they” are in his comments, and when probed, like other parents, draws attention primarily to companies advertising on Facebook, Instagram or Google, rather than the platforms themselves and their architecture. He mentions a few times in our conversation stories about teaching ICT to secondary children, and having regular conversations about online safety with them. These resources gained from teaching ICT become part and parcel of Patrick’s own practice when encountering algorithms himself.

Using professional competencies as resources in relatively broad abilities to be algorithm aware, or even to be critical in assessing content on algorithmic interfaces, does not necessarily mean that such knowledge applies in all contexts, across all platforms. Algorithm literacies are, by default, fluid, and often context and platform dependent. Audrey, who has children ranging between the ages of 18 and 2, works for a church learning network and supports churches in their learning and development needs. She has also returned to higher education later in life, having just completed a Masters degree. Audrey tells me that she runs most of her searches through Google Scholar rather than Google alone, because she has worked out that the results are often ranked strangely, showing her what she has already seen, clicked on or liked before. But this awareness does not necessarily translate to her approach, for instance, to algorithmic curation on newsfeeds, or indeed to algorithms at work on Google Scholar. Audrey, whose soon to be adult daughter has made unexpected life choices in Audrey’s view, reported feeling distraught at the sheer volume of proud parent posts about young adults going off to University. It was only on specifically reflecting on timelines over the course of our conversation that she volunteered the possibility that her timeline was perhaps not chronological, telling the story of parenthood as it unfolds in real time, and that it was perhaps curated in one way or another.

It was clear, from many similar conversations, that parents draw from a range of professional and educational competencies in their interfaces with algorithms, both for themselves and in relation to their children, but that such drawn knowledge and understanding was not uniform across contexts and platforms. Akemi, a mother of two secondary aged boys, speaks to me about her worries around racism, particularly in relation to her children. She encounters numerous videos on YouTube about racially minoritized children being subjected to racist abuse and violence, and is distressed that these keep playing, and being recommended to her. Whilst the YouTube recommendation algorithm was opaque to Akemi, and the relationship between her own viewing and the long line of distressing videos was not clear to her, she carefully monitored the content she put out on social media about her own children, because she did not wish them to appear on others’ feeds. When probed about what might make her posts about her children appear on newsfeeds, she drew my attention to the Etsy algorithm, which perplexed her as she tried to increase the visibility of her small business online, often unsuccessfully. Uncertain whether Etsy operated in quite the same way as Facebook, she was certain, though, that visibility and invisibility on feeds was, opaquely to her, decided by algorithmic logic.

I’m actually selling things on the Internet. OK, so I know it’s very important for, you know, for my listing. Because I need to to make my product to be seen … .There are a lot of ways, but at the end you don’t know which way is better, but people tell you or you need to make the keywords right, you know like user the the correct keywords and also it’s it’s I feel like it’s like when you don’t have good sell. … .it’s like a negative, it’s getting negative and then positive. So for example, my projects like for the past month. It went really good. OK, so I’ve noticed when you search, it’s easy to come to the top … .You know what I mean? Yes. So it’s going to be again, like when people are doing good, they are doing better and better.

Akemi, mother of a 14 year old and an 11 year old, north of England

Conversations

Akemi’s prospection (Iser, Citation1974; Ytre-Arne & Das, Citation2021) about algorithmic ranking draws upon her practical (Cotter, Citation2022) knowledge and understanding, which, she feels, isn’t quite there yet in fully grasping what goes on underneath feeds, but which is enough for her to wish to keep her own children from appearing on anybody’s feeds, whilst she tries to manage her own visibility on a selling platform. But it also transpires that the sources of Akemi’s musings and generic knowledge were not solely to do with her own professional, non-mechanical grasp of algorithms. It appeared a key source was also conversation with peers and co-sellers on Etsy –

What other sellers say. everyone is saying it’s difficult, you know. One person even said he used to work Google. And then he found their the their system is a bit strange. You never you. You never be able to know what actually helps you. You … will be listed on the on the top, you know, but my experience is that cause we did very well last month. So I feel like the product is like when you search it’s easily to go on the top.

Akemi, ibid

Cotter, writing about critical algorithmic literacies (Cotter, Citation2020), argues persuasively that “people draw on a deep well of local, contextual insight that grants coherence and legibility to what algorithms ‘want,’ what they do, how, and with what effect” (p. 237–238). Akemi, here, and several of the parents who I speak about next, draw upon such a well of knowledge, which, in addition to containing professional competencies and skills, often also includes shared parenting talk with other parents, or sometimes colleagues, extended family and others. For instance, Isabel, below, draws upon conversations with fellow parent friends, and her ex-husband, to form her ideas about algorithms and data journeys, before settling on her own personal philosophy of “not thinking too hard about it all, as an exhausted single mum.” Her practices around technology are shaped by her draining work and life schedule as a single parent, she says, but she argues she is informed and aware of algorithmic harms, for instance. When I engage her more about where her inklings and awareness might have arisen, she draws my attention to conversations – conversations with fellow parents, and with her former partner.

Yeah, I think it was obviously very early days in, in the world that we’re in now with advertising being individualised and pinpointed on things we’re looking at. My friends and I spoke about it – and many of them have children my daughter’s age and often switch various things off. So I think it probably freaked me out a little bit and … . I think it probably did sort of worry me quite a bit … …

Isabel, mother of a 4 year old, Shropshire

The role of others is particularly key for parenthood and parenting, as research on the mediation of parent networks shows (Das et al., Citation2023). In addition to what we already know about parents’ knowledge-sharing, advice and support in online platforms and communities (Das, Citation2019; Madge & O’Connor, Citation2006), contemporary parenthood in digital societies involves numerous overlapping parent communities – WhatsApp groups, for instance, bringing entire classes or grassroots sports groups together. Parents, like Isabel and others I spoke to, also talk about technology, and these conversations shape understandings and practices. But conversations also occur in the home, with partners and ex-partners, and many parents spoke of their partners or former partners as the sources of their first remembered bits of skepticism about datafication and algorithmic shaping. Sometimes, as with Isabel, these conversations initially triggered further thinking, critique and research; then, as practices with technology are a work in flux and progress, people’s approaches change and morph –

The Facebook and different things and my ex-husband was very obsessed with. Everybody was listening and watching us, you know, it was quite a conspiracy theorist. So I think it was probably influenced little bit by him as well. So I would would do that. I probably say there’s a number of reasons now why I don’t feel the same. Number one, I suppose I’ve come away from that influence. He was very intelligent but also quite paranoid like character in lots of ways … It actually I tend to see things quite a lot that are helpful to me and are of interest to me.

Isabel was not alone in my dataset in labeling concerns or critique about technology, in particular about datafication and algorithms, as cynical, or as believing in conspiracies. Parents who showed a high degree of skepticism or caution often prefaced their words with a preamble about how what they were about to say probably made them sound like a conspiracy theorist. Like Isabel, Audrey too speaks of conversations at home, in her case with her partner, as potentially useful for thinking about algorithmic shaping, but, also like Isabel, labels this as overly cynical or as close to believing in conspiracies. It is important, I suggest, to pay attention to these discourses, where critique is conflated, unnecessarily, with over-the-top cynicism or belief in conspiracies. At a time when technology is often heralded as the solution to a wide-ranging set of problems, it is perhaps tempting to suspend the work of critique, particularly when coping with parenting practicalities, often doubled up with trying circumstances and pressures which exhaust, or deplete.

These conversations, at home, or with others, shape parents’ algorithm awareness, understanding and degrees of critique in wide-ranging ways. In contrast to Isabel or Audrey, Rhianne, mother of a toddler and a newborn, confidently asserted to me that there was nothing in particular to be concerned about in relation to algorithmic shaping or the journeys of data, because her partner worked for a global technology company, and because their home was “hooked up” to many devices which were not even on the market yet. Rhianne, like Mehmet or Patrick, who we met earlier, is also a teacher, but in speaking to me about what shapes her own algorithm awareness and literacies, her main reference point is her partner, who works inside the tech industry. Rhianne later says she is upset at receiving a barrage of posts about neonatal loss in her timeline because of a donation she made to a neonatal loss charity, leading us to a conversation about what shapes her timeline. Rhianne muses that it might be chronological, or that perhaps small amounts of data might be flowing across platforms, but relies significantly on her partner’s perspectives from within the technology industry as she forms her views.

Events

Just as conversations with peers and families, or professional competencies, matter in shaping and divergently resourcing parents’ practices with algorithmic interfaces, so do digital events – episodes or incidents ranging from life-changing moments, such as having a baby, to small, apparently unnoticeable but nonetheless meaningful moments of shift – where parents’ practices with data and algorithms appear to shift, morph or swivel. Hodkinson and Brooks (Citation2023) speak usefully of watershed moments as crossroads – “significant events, developments or changes that occur within the journey of caregivers” – and in developing my ideas on parents’ algorithm literacies and where they come from, an attention to these, often digital, crossroads is key. Borrowing from Hodkinson and Brooks’s conceptualizing of crossroads (see also Das, Chimirri, Jorge, & Trueltzsch-Wijnen, Citation2023 on crucible moments) in parents’ journeys, I draw attention here to digital events in parents’ lives where their relationships with technology broadly, and particularly with data and algorithms, shifted. Delyse, a young mother of twin daughters who coped with very trying circumstances in the run-up to having children, and who continues to cope with trying circumstances, maps out to me when and why she decided to pause critique and embrace a very high degree of dataveillance (Mascheroni, Citation2020; Van Dijck, Citation2014) in her children’s lives. Delyse tells me how, prior to having her now four-year-old daughters, she had significant mental health struggles, a high amount of debt, and was without a partner. Unable to cope with the pressures of twin infants and no personal support at all whilst tackling emotional and financial crises, she discovered tracking at the hospital – which, for her, was a pivotal moment of shift –

So it came from being in the hospital. They track in the hospital and paper. You know, their feeds and how much they have. And then I thought, let me carry that on into home and then immediately I used on my notes app and found that it was just too hard to follow. So then I searched on Google baby tracking apps and it came up with this one. It simply just called tracker. A lot of what I tried to do was make sure it wasn’t any personal information that I was put in and that I was putting in. So instead of putting in individual names, I would just put in that it was a bowel movement and then try to just remember that one had done it so that one hadn’t.

Delyse, mother of twin 4 year olds

Delyse, like Isabel, speaks to me of the exhaustion of single parenthood, and of her attempts to stay in control by tracking and using surveillance. When I draw her into a discussion about what she thinks distinguishes the two moments of surveillance – nurses tracking new mums and babies on paper (at least, in the moment), and parents tracking babies on an app – she immediately outlines a range of ways in which she tries her best to protect her children’s data. She tells me they are not to be “fodder” for algorithms, and when I draw attention to the tracking in their infancy, she speaks of the many ways in which she tried to conceal the infants’ identities from the app that she relied on so heavily. Delyse remembers this moment, from four years ago, as a conscious shift: embracing certain kinds of technology, but distancing herself (and indeed, her children) from other kinds, such as social media platforms. Parents’ understanding, awareness, and technical and critical competencies are projects in flux, and divergent across diverse platforms. Memories of significant, or apparently unnoticeable yet meaningful, digital events in their lives shape what they bring to their interfaces with data and algorithms (Livingstone & Blum-Ross, Citation2020). Becoming a parent, or entering a new stage in parenthood, often becomes an event – a memory – a crossroad (Hodkinson & Brooks, Citation2023) which shapes practice and literacies, as Lewis demonstrated. When Lewis and his partner discovered that their social media newsfeeds were filling with baby products and adverts – despite their having kept the pregnancy off social media, though they had indeed searched for a variety of information and advice about new parenthood – they made the decision, then, to keep their child offline for as long as possible. Lewis spoke of the importance of that moment of discovering a strangely baby-themed newsfeed, and of not being able to figure out why it looked that way.
Lewis’s technical understanding of the algorithms underneath platforms and apps is, by his own admission, limited, but from that moment on, he decided to limit who sees his toddler on a nursery app.

For Audrey, the moment was a strong recollection of being tripped up whilst trying to get her daughter a provisional driver’s license: having discussed the process online with friends, and having searched for it too, she failed to notice a sponsored story which misled her. That moment persists for Audrey – prefaced, in her telling, with a preamble about how small and inconsequential it is – as a moment to think about why she sees what she sees online, and how to navigate social media differently. For Terri, mother of two adopted children whom she is keen, for numerous reasons, to protect online, the pivotal moment occurred in a farm. Terri recalls seeing a child repeatedly on her newsfeed, to the extent that she recognized him when out and about –

Because I follow, I follow like a local mum ..and she’s always posting her kids. It’s like I know their age. I know their likes dislikes. I know everything about that child and I actually met the child in a farm he just walked up, not with the mum. Just walked up and I went. Ohh hello (name) … . And I was like Oh my God, why did I say that I cause I felt like I knew him? And I felt really creepy and I thought, ohh, I’m really sorry …

Terri, mother of a 3 year old and a 2 year old, Hampshire

That moment in the farm altered Terri’s understanding of, and approach to, timelines. The understanding she reveals is not technical but, quite like that of Cotter’s (Citation2020) respondents, practical, and the boy in the farm persists as a significant pivot in her practice. These digital events – nonevents, perhaps, to the casual observer – came up in numerous conversations with parents as they spoke about their understandings, awareness and literacies with algorithms and data more broadly. A fleeting realization one day, out and about, a moment of figuring something out about a platform, or a push from somewhere to embrace or reject a particular technology – these act as shapers of parents’ practices and philosophies, I found, in numerous conversations.

Dimensions, markers and implications of parents’ algorithm literacies

I began this paper with a suggestion that we approach parents’ algorithm literacies – one component of their wider platform literacies (DeVito, Citation2017; Reisdorf & Blank, Citation2021) – as always-contextual practices which address their negotiations of algorithms as part of their own experiences of parenthood as well as their practices of parenting. This, in keeping with media, digital and data literacies scholarship, and responding to developments in the study of digital parenthood, leads me to argue that parents’ algorithm literacies have four components: 1) parents’ awareness (Gran, Booth, & Bucher, Citation2021) of the presence of algorithms, 2) parents’ technical competencies, 3) parents’ critical capacities (Snyder & Beavis, Citation2001) to make sense of what algorithms represent, and 4) parents’ practices to protect their and their children’s best interests, including mentoring, brokering and shaping (c.f. Livingstone & Blum-Ross, Citation2020) the role of algorithms in their children’s lives. In what follows (see Table 2), I identify specific, practical markers of each of these dimensions of algorithm literacies in my dataset, and, for each, summarize some of the implications for parenthood and parenting. I am inspired here particularly by Swart’s (Citation2023) exposition of news literacies, where she locates tactics in engaging with news content which protect personal data privacy, as well as by Oeldorf-Hirsch and Neubaum’s exposition of the cognitive, affective and behavioural dimensions of algorithm literacies (Oeldorf-Hirsch & Neubaum, Citation2023).

Table 2. Dimensions, markers and implications of parents’ algorithm literacies.

Parents’ algorithm awareness

One of the central questions at the heart of this project, which guided my fieldwork, was how aware (Hamilton, Karahalios, Sandvig, & Eslami, Citation2014) parents were of the presence of algorithms and algorithmic curation in the variety of online environments they frequented, and in which they shared stories of their own parenting, and sometimes many glimpses of their children. I was keen to investigate the role of algorithmic shaping and curation in the visions of parenting and parenthood which the parents I spoke to gathered from their everyday scrolling online. This, in turn, shapes their own parenting, decisions and emotional reactions, and the contributions they make to a variety of newsfeeds and timelines. Rijula, for instance, drew my attention to why she thinks natural parenting, exclusive breastfeeding and non-use of formula show up more in her mediated view of the world, mentioning algorithms specifically in relation to her own searching and browsing for exclusive breastfeeding material. Likewise, Aadi, a doctor, draws my attention to the ways in which he is aware that his search results potentially look different from others’ –

I just searched for the word tantrums and again the top thing that’s coming up is NHS and I suspect that probably doesn’t come up for other people and so probably because I work for the NHS or it knows that I’ve searched I I’ve been on lots of websites that are NHS related for work reasons and because I’m normally logged into my browser which is permanently logged into my Google account, so it clearly knows what I’m browsing.

Aadi, father of a 3 year old, south of England

Aadi went on to explain to me the ways in which he ignores, and often resists using technical skills, the myriad commercial intrusions he encounters when searching for potty training advice, and how he interprets social media content about other people’s toddlers speaking and making developmental progress. Whilst he did not explicitly use the word algorithm, the attention he paid to ranking and the possibilities of filtering was starkly different from that of, for instance, Lara, the mother of older teenagers, who thought her timeline was chronological. On reflection, and in conversation, she thought back several years, to how much content she saw about children succeeding at school whilst struggling with her own child’s difficulties at school. I was curious to find out how aware parents are of the role of algorithms and the presence of algorithmic filtering, and whether this awareness mattered. Conversations with parents established clearly that only a minority of those I spoke to appeared to be algorithm aware (Gruber et al., Citation2021), but the parents who were indeed aware of some degree of algorithmic shaping – whether or not they used the word algorithm specifically – had a degree of separation from the content of parenting-related stories or news feeds and what they represented, and a clearer understanding of time, chronology and the role of curation and clustering (c.f. Lupinacci, Citation2022), for instance when stumbling upon news stories on a news aggregator site which worried them about their children. Naturally, none of the parents I spoke to, being non-experts, had any access to the real algorithms underlying any of the interfaces we spoke about. But being aware of the presence of even the most nebulous and hazy of rules shaping the vision of parenthood they see, and the feeds that they contribute to, made a difference to the ways in which they interpreted content and contributed to it.
I suggest that this awareness of the presence of algorithms, whether or not they are described as such, is significant (Fouquaert & Mechant, Citation2022) for parenthood and parenting, not solely in terms of parents being able, potentially, to identify and resist the clustering of intensive parenting content on their news feeds, but also in the skepticism, caution and scrutiny they might bring to search results, timelines, and more. Here, I note that awareness does not, of course, function in isolation, and often requires a degree of curiosity about why things come to be a certain way in algorithmic environments, as Oeldorf-Hirsch and Neubaum (Citation2023) suggest. Being aware of the presence of algorithms, I found in my dataset, shaped the decisions parents made around personal privacy (Shin, Kee, & Shin, Citation2022), including the degree to which they expose their children, their children’s faces, or even stories about their children online. When parents were aware of personalized search results, it enabled a degree of scrutiny. When parents were aware of the possibility of curation and filtering on newsfeeds, their interpretation of parenting content shifted; and when they made sense of, for instance, why certain children’s products might be recommended over others, or why certain children’s videos or parenting videos might auto-play over others, I found a degree of distanciation from taking the content at face value, or acting on it unthinkingly.

Parents’ technical competencies

Whilst literacies are never an amalgamation of button-pushing skills, technical competencies are a key aspect of algorithm and platform literacies, within the context of digital literacies broadly, and they are not divorced from broader critical capacities amidst broader skill gaps (van Deursen & van Dijk, Citation2015). Parents who knew how to maneuver their way around search engines, social media platforms, news aggregator sites and apps of various kinds displayed a set of skills around muting, selecting, deselecting, guarding, and creating zones and rings of privacy levels – which amounted to more than the sum total of these parts. Parents’ algorithm skills (c.f. Hargittai, Gruber, Djukaric, Fuchs, & Brombach, Citation2020), I found, fed out of the algorithm awareness I allude to earlier, and sat hand in hand with critical competencies. Many who did not display a high level of technical skills, of course, often swung to extremes when it came to children’s content online. Some, like Akemi, operated a blanket ban on their children featuring on any friends’ newsfeeds – Akemi speaking also about her assumption that the visibility of all content was similar across the board for everyone (Eslami et al., Citation2015). I saw how Akemi’s self-directed learning, chatting and musing about algorithms on her selling platform (see also Klawitter & Hargittai, Citation2018) were gradually building up a sense of algorithm awareness, even if her technical competencies, by her own admission, lagged behind where she would like them to be. Others, also with lower technical skills, had no hesitation whatsoever in posting about children and their pictures online, often despite a broader awareness, in principle, of some of the risks attendant to datafication and dataveillance.
Mehmet, who we met earlier, uses his critical capacities as a teacher of secondary children, and encourages many conversations at home and at school on scrutinizing sources and evaluating information. But despite this, Mehmet’s own technical skills and broader algorithm awareness appear fairly low. Mehmet asserts to me, for instance, that the feed he sees on his timeline is “completely chronological” (Rader & Gray, Citation2015), and that search results are similar across the board, depending only on the keywords one uses.

Parents who were algorithm aware often demonstrated practices of leaving fewer data traces (such as likes or shares) to actively alter what they see. Liam, for instance, who is very algorithm aware, appeared to have stumbled his way, over time, into learning how to make technical adjustments to how many traces he left of himself online –

It might be an also think something I search and I deleted my history and I was like well, why has Google remembered that? And then I thought ohh Google remembers your history, you have to turn your activity off. I was like ohh how do I do that? Ohh you you go this this this turn your activity off that that’s that that that thing is….You know, they say that’s, you know why VPN’s are so publicised so much. Because, you know, apparently the ISP, your Internet service provider is always watching Google’s always watching. So. But yeah, I have WhatsApp as well so….And you know Big Brother watching me, kind of why?

Liam, father of a toddler and an infant, south of England

Many used interface options such as muting, deselecting or reporting to alter the shape of their feeds, but this did not necessarily sit in complete alignment with a broader critique of algorithmic interfaces, or with a firm awareness of algorithmic shaping. Liam’s talk demonstrates the pondering, uncertainty and experimentation that often form part of the journeys which develop varying degrees of technical competencies. My conversations with parents surfaced important if subtle practices of scrolling quickly, ignoring content, or refusing to click on suggested or recommended content with the specific intent of retraining algorithms, even when the words training or algorithm did not feature in parents’ talk. I also draw a distinction here between parents’ self-reported levels of awareness or technical competence and their actual practice. For instance, Jenny spoke to me with a 2-week-old infant whilst recovering from a cesarean section after a planned home birth. She shared data at a high level, including online photo albums and breastfeeding apps to measure progress, reiterating how much she trusts these apps. This sat alongside her confidence that she knows how platforms function and is very skilled with interfaces – which she indeed was. But there was a significant gulf between reported levels of self-confidence and actual alertness and action, which leads us to consider the critical capacities that must accompany technical competencies.
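The retraining tactic parents describe – withholding clicks and likes to leave fewer traces and so alter what a feed shows – can be illustrated with a deliberately simplified sketch. Everything below is hypothetical: the scoring rule, weights, topics and engagement counts are invented for illustration, and real platform rankers are proprietary and vastly more complex.

```python
# A toy, hypothetical engagement-based ranker: nothing here reflects any
# real platform's logic; weights and topics are invented for illustration.

def rank_feed(topics, engagement):
    """Order topics by a toy score: clicks plus doubly-weighted likes."""
    def score(topic):
        signals = engagement.get(topic, {"clicks": 0, "likes": 0})
        return signals["clicks"] + 2 * signals["likes"]
    return sorted(topics, key=score, reverse=True)

engagement = {
    "baby products": {"clicks": 10, "likes": 4},  # heavy past engagement
    "local news": {"clicks": 3, "likes": 1},
}

# Past clicks push baby content to the top...
print(rank_feed(["local news", "baby products"], engagement))
# → ['baby products', 'local news']

# ...but a parent who deliberately stops clicking and liking baby content
# leaves fewer traces, and the ranking shifts accordingly.
engagement["baby products"] = {"clicks": 0, "likes": 0}
print(rank_feed(["local news", "baby products"], engagement))
# → ['local news', 'baby products']
```

The conceptual point, not the arithmetic, is what matters here: because engagement signals feed back into ranking, withholding engagement is itself a form of action within the system, which is why such quiet practices count as more than button-pushing.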

Parents’ critical capacities

Critique has been at the heart of media and digital literacies scholarship (Ávila & Pandya, Citation2013; Buckingham, Citation2007; Livingstone, Citation2008), with decades of research pointing to the central role of critical capacities within digital literacies (Coiro, Citation2015; Kress, Citation2003). My interviews showed that parents’ awareness of the presence of algorithms, and their abilities to technically maneuver algorithmic interfaces, were tied to their broader critical capacities. I note here that nearly all parents displayed some degree of critical digital awareness of content around children’s online safety, as covered within the English school curriculum across the key stages, and, for many, this also extended to a critical stance on commercial institutions, with many speaking of myriad ways to avoid sponsored adverts, posts and persistently intrusive commercial recommendations (see here Polizzi, Citation2021). But critical capacities extending further – beyond a basic understanding of privacy to, more broadly, the logic of algorithmic curation for platforms, or the broader implications of an increasing algorithmification of everyday life and the public domain – were far scarcer. Kress’s (Citation2003) assertion of the importance of time, distance and reflection (see also Pangrazio, Citation2016) was a key facet of some parents’ musings about their practices around algorithms. Audrey said, when prompted to reflect on her pain at seeing a barrage of posts about successful young adults on her newsfeed –

It’s mood inducing, because I would have understood that they would have tried to do it based on interactions and because you interact with something that gives you joy or angers you it stimulates you in some way. But if they were doing it based on those binaries of what makes me happy and what enrages me, wow, I mean that’s shocking. I would I would understand if they would do it on what makes you happy, but if it’s if it’s based on those binaries, that’s quite shocking that they would purposefully antagonise and enrage. Although again, it’s a cynicism that’s I’ve had before in the back of my mind.

Audrey, mother of an 18 year old, 12 year old and 2 year old, south of England

Lara and Audrey, for instance, both of whom confessed to never having thought about what lies behind a timeline, reflected in conversation with me about things they had seen a while ago, and wondered why that was. Audrey considered whether algorithmic clustering and curation had anything to do with platforms manipulating people’s emotions. This reflection, of course, occurred in the course of conversation, but it ties in with the importance of distance and time in developing critical capacities around algorithms, away from the immediacy that these environments beckon. One of the striking facets that came out of this set of conversations with parents was that broader critical capacities around digital environments did not necessarily translate into critical capacities with algorithms and datafication; and, unsurprisingly perhaps, the presence of technical competencies in parents’ practices did not sit in neat overlap with critical algorithm literacies. Cotter (Citation2022), discussing the practices of BreadTubers, identifies two essential implications of critical algorithm literacy: such literacies enable people to recognize their role within algorithmic systems and their attendant logics, and this then affords people opportunities to direct their own actions in ways that advocate for their best interests. Polizzi (Citation2021) extends the scope of critical literacies beyond the evaluation of content alone, to a broader reflection on the political economy of platforms and its role in civic life.
Relating these to the context of parenting, in Table 2 I outline the markers I found of parents’ critical capacities in relation to algorithms, and I suggest that these capacities – understanding the if-then logic of algorithms and the role of personal data within it, comprehending platforms’ commercial purposes, and understanding the implications of the use of algorithms within the public domain, particularly in relation to children’s futures – link to how adeptly parents are able to operate within algorithmic interfaces to achieve and support their and their children’s best interests.
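The if-then logic referred to above can be sketched, at its barest, as a set of conditional rules mapping observed signals about a user to promoted content. The signals, rules and category names below are invented for illustration only – loosely echoing Rhianne’s experience of a charity donation apparently triggering neonatal content – and do not represent any platform’s actual rules.

```python
# A bare-bones illustration of if-then targeting: an observed signal about
# a user triggers a content rule. All signals and rules here are invented.

RULES = [
    # (condition over a set of observed signals, content category promoted)
    (lambda s: "donated:neonatal_charity" in s, "neonatal stories"),
    (lambda s: "searched:baby_tracker" in s, "parenting app adverts"),
]

def promoted_categories(signals):
    """Return the content categories whose if-condition the signals satisfy."""
    return [category for condition, category in RULES if condition(signals)]

# A single donation signal is enough to trigger a stream of related content:
print(promoted_categories({"donated:neonatal_charity"}))
# → ['neonatal stories']
print(promoted_categories(set()))
# → []
```

Grasping even this skeletal if-then structure – that one recorded action can condition a whole category of subsequent content – is the kind of practical, non-technical understanding the critical capacities above describe.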

Parents championing best interests

As previously noted, the majority of attention to parents’ digital literacies has focused on their implications for sharenting (Barnes & Potter, Citation2021; Siibak & Traks, Citation2019) in relation to the datafication of childhood (Barassi, Citation2019; Mascheroni, Citation2020). Whilst doubtless important, the ability to champion children’s and parents’ best interests extends beyond sharenting alone, to the degree to which parents are able to advocate for themselves, and read themselves, within algorithmic systems. Being able to scrutinize attempts by institutions to measure, surveil or count families’ and parents’ data (Edwards & Ugwudike, Citation2023); being able to understand, engage with, or even challenge private and public institutions applying algorithms to generate outcomes which apply to parents and their children; being able to keep doors and conversations open with children across ages on navigating algorithmic interfaces; or being able to interpret the world of parenthood as seen on timelines and feeds, recommendations, or search results – these are vital components of parents’ algorithm literacies. Livingstone and Blum-Ross (Citation2020) draw out succinctly parents’ numerous positive roles in datafied domains in digital societies, as mentors and brokers for their children, as does Alper (Citation2023) in listening to how parents advocate for their children. Lewis, father of a toddler and expecting a new baby, explains to me, for instance, how he has engaged with his children’s nursery about their use of the nursery’s app: Lewis insists that pictures of his son must not be sent to other parents, and is perplexed as to why not everybody insists on this.

We are trying to safeguard him as much as we can. To limit his photos out there. Nursery have got an app called Famly. F AM ILY no I in it. And they update news on his Day at Nursery and what he’s eaten when he had his nappy changed, how long he slept. And they share photos every now again of what he’s been up to with other people and stuff like that. Now we don’t allow anyone to have photos from him on that app. But you know, we see plenty of other photos of other kids on the app.

Lewis, father of a 2 year old and expecting a new baby, south of England

Throughout, Lewis also displays a broader awareness of how platforms and algorithms operate within other spheres of parenting, and these broader critical platform literacies (DeVito, Citation2017) – displayed, for instance, in engaging with his children’s nursery about the app – link to the sort of championing that critical algorithm literacies must encompass. Many parents of toddlers and younger children who said they accepted algorithmic selection and filtering as part of modern life nonetheless felt concerned and anxious that the same technical processes might lead to their teenagers being recommended content that was damaging to them. When probed about what they might do to address this, the vast majority spoke of technical controls, but many also spoke of the value of conversation and engaging children in an ongoing dialogue. I suggest that the mentoring and brokering role Livingstone and Blum-Ross identify as part and parcel of parenting in contemporary digital and datafied societies is a key dimension of parents’ platform and algorithm literacies, in a world where families and parents are increasingly rendered as data (c.f. Edwards & Ugwudike, Citation2023), and where advocating for their own and their children’s best interests matters more than ever. Championing best interests amalgamates the technical and critical capacities outlined above into an active role in shaping children’s experiences in datafied societies, affording parents opportunities to engage with their children, as well as with the countless institutions involved with youth, to advocate for themselves and their children.
In the conversations I had with parents, such championing related largely to children’s data privacy and security – as Lewis showed with the Famly app, as Delyse attempted in anonymizing bowel movement tracking to the best of her capacities, or as Terri attempted in never posting her children’s faces. Other elements of life with algorithms, including filtering, curation, recommendations and the broader use of algorithms in public domains, were less clear to parents, including those with high critical capacities.

Discussion

One of the key things to note about the dimensions and markers of parents’ algorithm literacies is how fluid these are. The same parent might display vastly different practices at different stages of parenting, and their practices and literacies might diverge across platforms. This leads me to conclude with a typology of the 30 parents who introduced me to algorithms and their lives in this project. I speak here of those who are algorithm aware in principle, those who are alert in practice, and those who are active shapers in relation to algorithms. I do not intend these categories to be watertight, because, in practice, they are not. As parents are on unfolding journeys of parenthood, in an inherently cross-media and multi-platform life, the same parent may display a greater alignment with one position on this typology in relation to a particular platform, but align with a different position in relation to another. Likewise, as their children grow, as new digital events occur in their lives, and as their own trajectories and journeys (Hodkinson & Brooks, Citation2023) unfold, their position on any typology remains anything but static.

Acknowledging this fluidity and movement is key, particularly when reading a simple, and by definition not watertight, typology of the 30 parents I spoke to, in terms of their literacies with algorithms. First, I speak of the aware in principle. The vast majority of the parents would sit at this point in the typology. They displayed nearly unanimous awareness of interpersonal risks in a broader digital environment, with significant overlaps with the key stage curriculum in schools around online safety. At this point in the typology, I noticed a remarkable coexistence of a broader, and doubtless uneven, recognition of algorithms and under-the-bonnet rules alongside a willingness to sit with these, or even to consider them a trade-off in mediated societies. A functional, transactional approach (see here Das, Citation2023) to algorithms and data dominated at this point in the typology, where parents expressed in myriad ways that, for better or for worse, they had learnt to live alongside algorithms, without significant changes to their own practices. I suggest that the aware in principle point in the typology is of particular interest to those of us researching parenthood and parenting because it represents, on the one hand, a success story of parents reflecting on broad issues of privacy and security, but on the other, a remarkable, and not implication-free, willingness to accept and sit alongside an increasing datafication of parenting and parenthood. Parents who were alert in practice demonstrated not solely many of the markers of algorithm awareness identified above, but also many practices of technical choices and tactics designed to maneuver within algorithmic systems, in recognition of the systems’ broader power and in resistance to it (Cotter, Citation2020).
The difference between this point on the typology and the former rests not solely on a higher degree of algorithm awareness, but on a lower willingness to live with these systems without any discernible action. Alertness in practice might often be dismissed as button-pushing alone. But as numerous accounts of tinkering with privacy settings, deliberately unfollowing, rapidly scrolling, or considering the implications of past searches showed, alertness in practice was more than the sum total of disparate technical skills. Last, I draw attention to the active shapers: parents whose algorithm awareness and technical competencies were high, but who also acted in conjunction with their critical capacities and broader acts of championing their and their children’s best interests. Such acts of active shaping might involve consistent conversations with children about algorithmic shaping, conversations with institutions such as nurseries or schools, or actively deciding to significantly alter one’s patterns of engagement with platforms.

In this paper, drawing upon many decades of scholarship in media and digital literacies, I noted, first, that like all practices, the algorithm literacies of the 30 parents who shared their stories with me have contexts and histories. I argued that this means paying attention to the uneven resources which shape parents’ algorithm literacies and listening closely to the stories of knowledge sharing and parental talk that come through in parents’ conversations (see Dogruel, Citation2021 on folk theories of algorithmic operations; Bishop, Citation2019 on algorithmic gossip; and DeVito, Citation2017 on exogenous sources of algorithmic knowledge). It means noticing the significant role of family, friends, bystanders, and known and unknown others in vernacular (Pangrazio & Selwyn, Citation2019) practices of knowledge sharing in the broader context of parents’ algorithmic knowledge (Cotter & Reisdorf, Citation2020), and paying attention to the personal and professional competencies which shape their algorithm literacies. An attention to the histories of parents’ algorithm literacies means considering the role of digital events in parents’ lives, where they speak of memorable incidents or key points of shift (c.f. Hodkinson & Brooks, Citation2023) which alter their practices with algorithms in one way or another. My analysis of parents’ myriad practices – markers of their algorithm literacies – led me to tease out some of the facets of parents’ algorithm literacies, encompassing both technical and critical attributes.
Here, my argument has been that we should see algorithm literacies as deeply contextualized practices: the combination of parents’ technical skills; their algorithm awareness, including interpersonal, commercial, and institutional dimensions (see here Livingstone et al., Citation2021 on dimensions of privacy literacy); the personal philosophies and strategies which sit at the intersections of their role as digital parents within broader parenting cultures and the logic of intensive parenting (Lee, Bristow, Faircloth, & Macvarish, Citation2014); and their roles as shapers, mediators, mentors and brokers, as Livingstone and Blum-Ross argue in their work on parenting for digital futures (Livingstone, Stoilova, & Nandagiri, Citation2020). The aware in principle, the alert in practice and the active shapers in the fluid typology I spoke of are not, then, fixed in their roles, but rather in fluidity and movement across the span of parenthood, learning, unlearning and re-learning their roles in relation to ever-changing platform norms, with their algorithm awareness, technical competencies, critical capacities and abilities to champion their and their children’s best interests in constant flux. As Gunther Kress muses about changing literacies in changing media environments, in closing his work on Literacy in the New Media Age (Kress, Citation2003), parents, too, in relation to platforms, I suggest, are “the makers of meaning … . Not free to do as we would wish, but not as the victims of forces beyond our control either” (p. 176).

Acknowledgments

I am grateful to the sabbatical scheme of the University of Surrey which allowed me the time to do this work. This paper was presented, as a draft, to colleagues at the University of Bergen, to whom I am grateful for their detailed comments, with particular thanks to Brita Ytre-Arne for hosting me. I am grateful to have been supported by the Erasmus mobility scheme which enabled my visit to Bergen. I am also grateful to colleagues including Sonia Livingstone, Usha Raman, Veronica Barassi and Giovanna Mascheroni at the Datafied Family event in June 2023 for their valuable feedback on the draft of this paper presented at the event. My thanks also to those colleagues who have informally read drafts of the broader project this is part of, including Tereza Pavlickova, Ana Jorge, Francisca Porfirio, and Ana Kubrusly.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

The work was supported by the University of Surrey.

Notes

1 I am grateful to conversations with colleagues at the Datafied Family event, in June 2023, who drew attention to the practical advantages of not splitting media and digital literacy education into too many different literacies, but recognized also the importance of considering algorithm literacies and the specific demands presented by algorithms. To colleagues at the event, I also owe gratitude for encouraging me to think of algorithm literacies as nested within other literacies.

References

  • Alper, M. (2023). Kids across the Spectrums: Growing up Autistic in the digital age. Cambridge, MA: MIT Press.
  • Ávila, J., & Pandya, J. Z. (Eds.). (2013). Critical digital literacies as social praxis: Intersections and challenges (New Literacies and Digital Epistemologies, Vol. 54). New York, NY: Peter Lang.
  • Barassi, V. (2019). Datafied citizens in the age of coerced digital participation. Sociological Research Online, 24(3), 414–429. doi:10.1177/1360780419857734
  • Barnes, R., & Potter, A. (2021). Sharenting and parents’ digital literacy: An agenda for future research. Communication Research & Practice, 7(1), 6–20. doi:10.1080/22041451.2020.1847819
  • Bishop, S. (2019). Managing visibility on YouTube through algorithmic gossip. New Media & Society, 21(11–12), 2589–2606. doi:10.1177/1461444819854731
  • Bucher, T. (2017). The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, 20(1), 30–44. doi:10.1080/1369118X.2016.1154086
  • Buckingham, D. (2007). Digital media literacies: Rethinking media education in the age of the Internet. Research in Comparative & International Education, 2(1), 43–55. doi:10.2304/rcie.2007.2.1.43
  • Buckingham, D., Banaji, S., Carr, D., Cranmer, S., & Willett, R. (2005). The media literacy of children and young people: A review of the research literature.
  • Caronia, L. (2009). The Cultural Roots of knowledge vs. The Myths underlying the contemporary digital turn in education. In Media Literacy in Europe. Controversies, challenges and perspectives (pp. 25–33). Euromeduc.
  • Charters, E. (2003). The use of think-aloud methods in qualitative research an introduction to think-aloud methods. Brock Education Journal, 12(2). doi:10.26522/brocked.v12i2.38
  • Coiro, J. (2015). Purposeful, critical, and flexible: Vital dimensions of online reading and learning. In EUROMEDUC (Ed.), Reading at a Crossroads? (pp. 67–78). Routledge.
  • Cotter, K. (2022). Practical knowledge of algorithms: The case of BreadTube. New Media & Society, 146144482210818. doi:10.1177/14614448221081802
  • Cotter, K. M. (2020). Critical algorithmic literacy: Power, epistemology, and platforms. Michigan State University.
  • Cotter, K., & Reisdorf, B. C. (2020). Algorithmic knowledge gaps: A new horizon of (digital) inequality. International Journal of Communication, 14, 21.
  • Das, R. (2011). Converging perspectives in audience studies and digital literacies: Youthful interpretations of an online genre. European Journal of Communication, 26(4), 343–360. doi:10.1177/0267323111423379
  • Das, R. (2019). Early Motherhood in digital societies: Ideals, anxieties and ties of the perinatal. Routledge.
  • Das, R. (2023). Parents’ understandings of social media algorithms in children’s lives in England: Misunderstandings, parked understandings, transactional understandings and proactive understandings amidst datafication. Journal of Children and Media, 1–17.
  • Das, R., Chimirri, N., Jorge, A., & Trueltzsch-Wijnen, C. (2023). Parents’ social networks, transitional moments and the shaping role of digital communications: An exploratory study in Austria, Denmark, England and Portugal. Families, Relationships and Societies, 1–18. doi:10.1332/204674321X16841332631111
  • DeVito, M. A. (2017). From editors to algorithms: A values-based approach to understanding story selection in the Facebook news feed. Digital Journalism, 5(6), 753–773. doi:10.1080/21670811.2016.1178592
  • DeVito, M. A. (2021). Adaptive Folk Theorization as a Path to algorithmic literacy on changing platforms. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1–38. doi:10.1145/3476080
  • Dogruel, L. (2021). Folk theories of algorithmic operations during Internet use: A mixed methods study. The Information Society, 37(5), 287–298. doi:10.1080/01972243.2021.1949768
  • Edwards, R., & Ugwudike, P. (2023). Governing families: Problematising technologies in social Welfare and Criminal Justice. Taylor & Francis. doi:10.4324/9781003080343
  • Eslami, M., Rickman, A., Vaccaro, K., Aleyasen, A., Vuong, A., Karahalios, K., & Sandvig, C. (2015, April). “I always assumed that I wasn’t really that close to [her]”: Reasoning about invisible algorithms in news feeds. In Proceedings of the 33rd annual ACM conference on human factors in computing systems (pp. 153–162).
  • Fotopoulou, A. (2021). Conceptualising critical data literacies for civil society organisations: Agency, care, and social responsibility. Information, Communication & Society, 24(11), 1640–1657. doi:10.1080/1369118X.2020.1716041
  • Fouquaert, T., & Mechant, P. (2022). Making curation algorithms apparent: A case study of ‘Instawareness’ as a means to heighten awareness and understanding of Instagram’s algorithm. Information, Communication & Society, 25(12), 1769–1789. doi:10.1080/1369118X.2021.1883707
  • Gillespie, T. (2014). The relevance of algorithms. Media Technologies: Essays on Communication, Materiality, and Society, 167(2014), 167.
  • Gran, A. B., Booth, P., & Bucher, T. (2021). To be or not to be algorithm aware: A question of a new digital divide? Information, Communication & Society, 24(12), 1779–1796. doi:10.1080/1369118X.2020.1736124
  • Green, B. (1999). The new literacy challenge?. Literacy Learning: Secondary Thoughts, 7(1), 36–46.
  • Gruber, J., & Hargittai, E. (2023). The importance of algorithm skills for informed Internet use. Big Data & Society, 10(1), 20539517231168100.
  • Gruber, J., Hargittai, E., Karaoglu, G., & Brombach, L. (2021). Algorithm awareness as an important internet skill: the case of voice assistants. International Journal of Communication, 15, 1770-1788.
  • Hamilton, K., Karahalios, K., Sandvig, C., & Eslami, M. (2014). A path to understanding the effects of algorithm awareness. CHI’14 Extended Abstracts on Human Factors in Computing Systems, 631–642.
  • Hargittai, E., Gruber, J., Djukaric, T., Fuchs, J., & Brombach, L. (2020). Black box measures? How to study people’s algorithm skills. Information, Communication & Society, 23(5), 764–775. doi:10.1080/1369118X.2020.1713846
  • Hodkinson, P., & Brooks, R. (2023). Caregiving fathers and the negotiation of crossroads: Journeys of continuity and change. The British Journal of Sociology, 74(1), 35–49. doi:10.1111/1468-4446.12980
  • Iser, W. (1974). The implied reader: Patterns of communication in prose fiction from Bunyan to Beckett. Baltimore: Johns Hopkins UP.
  • Karizat, N., Delmonaco, D., Eslami, M., & Andalibi, N. (2021). Algorithmic folk theories and identity: How TikTok users co-produce knowledge of identity and engage in algorithmic resistance. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1–44.
  • Kaun, A. (2022). Suing the algorithm: The mundanization of automated decision-making in public services through litigation. Information, Communication & Society, 25(14), 2046–2062. doi:10.1080/1369118X.2021.1924827
  • Klawitter, E., & Hargittai, E. (2018). “It’s like learning a whole other language”: The role of algorithmic skills in the curation of creative goods. International Journal of Communication, 12, 3490–3510.
  • Kress, G. (2003). Literacy in the new media age. Routledge.
  • Le‐Phuong Nguyen, K., Harman, V., & Cappellini, B. (2017). Playing with class: Middle‐class intensive mothering and the consumption of children’s toys in Vietnam. International Journal of Consumer Studies, 41(5), 449–456. doi:10.1111/ijcs.12349
  • Lee, E., Bristow, J., Faircloth, C., & Macvarish, J. (2014). Parenting culture studies. Springer. doi:10.1057/9781137304612
  • Leighton, J. P. (2017). Using think-aloud interviews and cognitive labs in educational research. Oxford University Press.
  • Livingstone, S. (2004). Media literacy and the challenge of new information and communication technologies. The Communication Review, 7(1), 3–14. doi:10.1080/10714420490280152
  • Livingstone, S. (2008). Engaging with media—a matter of literacy? Communication, Culture & Critique, 1(1), 51–62. doi:10.1111/j.1753-9137.2007.00006.x
  • Livingstone, S. (2010). Media literacy and media policy. Medienbildung in neuen Kulturräumen: Die deutschprachige und britische Diskussion, 33–44.
  • Livingstone, S., & Blum-Ross, A. (2020). Parenting for a digital future: How hopes and fears about technology shape children’s lives. USA: Oxford University Press.
  • Livingstone, S., Stoilova, M., & Nandagiri, R. (2020). Data and privacy literacy: The role of the school in educating children in a datafied society. The Handbook of Media Education Research, 413–425.
  • Lomborg, S., & Kapsch, P. H. (2020). Decoding algorithms. Media, Culture & Society, 42(5), 745–761. doi:10.1177/0163443719855301
  • Luke, A. (2013). Defining critical literacy. In Moving critical literacies forward (pp. 37–49). Routledge.
  • Lupinacci, L. (2022). Phenomenal algorhythms: The sensorial orchestration of “real-time” in the social media manifold. New Media & Society, 146144482211099. doi:10.1177/14614448221109952
  • Madge, C., & O’Connor, H. (2006). Parenting gone wired: Empowerment of new mothers on the internet? Social & Cultural Geography, 7(2), 199–220. doi:10.1080/14649360600600528
  • Markham, A. N. (2020). Taking data literacy to the streets: Critical pedagogy in the public sphere. Qualitative Inquiry, 26(2), 227–237.
  • Mascheroni, G. (2020). Datafied childhoods: Contextualising datafication in everyday life. Current Sociology, 68(6), 798–813. doi:10.1177/0011392118807534
  • McCosker, A. (2017). Data literacies for the postdemographic social media self. First Monday.
  • Mihailidis, P. (2018). Civic media literacies: Re-imagining engagement for civic intentionality. Learning, Media and Technology, 43(2), 152–164. doi:10.1080/17439884.2018.1428623
  • Mollen, A., & Dhaenens, F. (2018). Audiences’ coping practices with intrusive interfaces: Researching audiences in algorithmic, datafied, platform societies. In R. Das, & B. Ytre-Arne (Eds.), The Future of Audiences: A Foresight Analysis of Interfaces and Engagement (pp. 43–60). London, UK: Palgrave Macmillan.
  • Oeldorf-Hirsch, A., & Neubaum, G. (2023). What do we know about algorithmic literacy? The status quo and a research agenda for a growing field. SocArXiv, 1–18.
  • Pangrazio, L. (2016). Reconceptualising critical digital literacy. Discourse: Studies in the Cultural Politics of Education, 37(2), 163–174. doi:10.1080/01596306.2014.942836
  • Pangrazio, L., & Sefton-Green, J. (2021). Digital rights, digital citizenship and digital literacy: What’s the difference? NAER: Journal of New Approaches in Educational Research, 10(1), 15–27. doi:10.7821/naer.2021.1.616
  • Pangrazio, L., & Selwyn, N. (2019). ‘Personal data literacies’: A critical literacies approach to enhancing understandings of personal digital data. New Media & Society, 21(2), 419–437. doi:10.1177/1461444818799523
  • Parsania, V. S., Kalyani, F., & Kamani, K. (2016). A comparative analysis: DuckDuckGo vs. Google search engine. GRD Journals-Global Research and Development Journal for Engineering, 2(1), 12–17.
  • Polizzi, G. (2021). Internet users’ utopian/dystopian imaginaries of society in the digital age: Theorizing critical digital literacy and civic engagement. New Media & Society, 25(6), 1205–1226. doi:10.1177/14614448211018609
  • Rader, E., & Gray, R. (2015, April). Understanding user beliefs about algorithmic curation in the Facebook news feed. In Proceedings of the 33rd annual ACM conference on human factors in computing systems (pp. 173–182).
  • Reisdorf, B. C., & Blank, G. (2021). Algorithmic literacy and platform trust. In Handbook of digital inequality (pp. 341–357). Edward Elgar Publishing.
  • Shin, D., Kee, K. F., & Shin, E. Y. (2022). Algorithm awareness: Why user awareness is critical for personal privacy in the adoption of algorithmic platforms? International Journal of Information Management, 65, 102494. doi:10.1016/j.ijinfomgt.2022.102494
  • Siibak, A., & Traks, K. (2019). The dark sides of sharenting. Catalan Journal of Communication & Cultural Studies, 11(1), 115–121. doi:10.1386/cjcs.11.1.115_1
  • Siles, I., Espinoza-Rojas, J., Naranjo, A., & Tristán, M. F. (2019). The mutual domestication of users and algorithmic recommendations on Netflix. Communication, Culture & Critique, 12(4), 499–518. doi:10.1093/ccc/tcz025
  • Siles, I., Segura-Castillo, A., Solís, R., & Sancho, M. (2020). Folk theories of algorithmic recommendations on Spotify: Enacting data assemblages in the global south. Big Data & Society, 7(1), 2053951720923377. doi:10.1177/2053951720923377
  • Snyder, I. A., & Beavis, C. (2004). Doing literacy online: Teaching, learning and playing in an electronic world. Hampton Press.
  • Swart, J. (2021). Experiencing algorithms: How young people understand, feel about, and engage with algorithmic news selection on social media. Social Media+ Society, 7(2), 20563051211008828. doi:10.1177/20563051211008828
  • Swart, J. (2023). Tactics of news literacy: How young people access, evaluate, and engage with news on social media. New Media & Society, 25(3), 505-521.
  • Toff, B., & Nielsen, R. K. (2018). “I just google it”: Folk theories of distributed discovery. Journal of Communication, 68(3), 636–657. doi:10.1093/joc/jqy009
  • van Deursen, A. J., & van Dijk, J. A. (2015). Internet skill levels increase, but gaps widen: A longitudinal cross-sectional analysis (2010–2013) among the Dutch population. Information, Communication & Society, 18(7), 782–797. doi:10.1080/1369118X.2014.994544
  • Van Dijck, J. (2014). Datafication, dataism and dataveillance: Big data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208. doi:10.24908/ss.v12i2.4776
  • Ytre-Arne, B., & Das, R. (2021). Audiences’ communicative agency in a datafied age: Interpretative, relational and increasingly prospective. Communication Theory, 31(4), 779–797. doi:10.1093/ct/qtaa018
  • Ytre-Arne, B., & Moe, H. (2021). Folk theories of algorithms: Understanding digital irritation. Media, Culture & Society, 43(5), 807–824. doi:10.1177/0163443720972314