
Do parental control tools fulfil family expectations for child protection? A rapid evidence review of the contexts and outcomes of use

Pages 29-49 | Received 18 Apr 2023, Accepted 13 Sep 2023, Published online: 29 Oct 2023

ABSTRACT

Among the parental mediation strategies promoted by policymakers to ensure children’s safety in a digital age is the use of parental control tools. A rapid evidence review was conducted to identify which families use parental controls and why, and the outcomes of such use, beneficial or otherwise. Of 1,656 articles returned by a keyword search of five research databases, the full text of 40 studies was coded to answer the research questions. The available research revealed that the use of parental controls depends on the age of the parents and children, their digital skills, parental involvement, and the motivation to reduce exposure to online risk. However, the consequences of use were mixed, with evidence of parental controls having beneficial outcomes, having adverse outcomes, limiting children’s opportunities or simply having no outcomes at all. While the review found little support for advocating parental controls as a stand-alone strategy, parents valued them when embedded in a broader approach to parental mediation and parent–child relations. The conclusions highlight the importance of a child-centred approach that holistically evaluates the potential of parental controls.

Impact Summary

Prior State of Knowledge: Parental controls are widely recommended by policymakers and the technology industry as a way for parents to keep their children safe online, but research usually examines parental mediation in general, rarely focusing on the use or effectiveness of parental controls in particular.

Novel Contributions: A review of the effectiveness of parental controls reveals mixed results: some uses of parental controls bring benefits, for example to children’s safety, but others have no effect or limit children’s opportunities, and some have adverse results, for example to family communication.

Practical Implications: Policymakers and technology developers should not rely on the use of parental controls to ensure children’s safety in a digital world, because the evidence does not support the efficacy of parental controls and, if poorly designed, they may introduce problems for families.

How can society ensure that children’s experiences in a digital world enable their opportunities to benefit while minimising the risk of harm? Now that internet access is widespread in many countries, and with governments, educators, health services and businesses all keen to promote children’s and families’ digital engagement, the importance of empowering and protecting children is rising up the agenda of policymakers and the public (O’Neill et al., Citation2020; Staksrud, Citation2016; UNICEF, Citation2021). Policymakers apportion responsibility partly to the regulators of digital providers of content and services and partly to end-users, including parents and caregivers (hereafter, “parents”) and children themselves. However, the legitimacy, implementation and effectiveness of current strategies are much debated, not least regarding the optimal balance between the responsibilities of government, businesses and families (Bulger et al., Citation2017; Lievens et al., Citation2018; Milosevic & Livingstone, Citation2018).

Recent years have seen considerable industry investment in a new generation of technical child protection measures, commonly called parental control tools or, more simply, parental controls. These comprise software that enables a responsible adult to control some or all of the functions of a digital device or service used by a child, filtering, limiting or otherwise determining access and use in ways intended to protect the child’s safety (UNICEF and ITU (International Telecommunication Union), Citation2020). Some operate at the level of the device – for example, they can limit the time spent on the device or on particular apps. Others manage the content children access online by filtering out pornography, hate and other potentially harmful content. Newer parental controls for social media may also mediate risky contacts by facilitating parents’ oversight of their child’s online interactions, for example through linked accounts or purchase approval tools. While many parental control tools focus on tracking usage and setting restrictions, some have nudging functionalities (e.g., sending educational messages) that aim to encourage children to develop digital habits (Bertrandias et al., Citation2023).

Such tools are often sold as the solution to the increasing pressure for responsible and competent parenting in a digital world and, possibly, to the changing concerns of today’s parents in an increasingly opaque internet ecology (Bertrandias et al., Citation2023; Mauk, Citation2021, Citation2022). At the same time, parents are provided with more tools to monitor multiple aspects of their child’s digital engagement closely. With each app, device or service developing its brand-specific approach, the complexity of parents’ tasks grows commensurately. Governments favour parental controls, possibly because their use can be tailored to different family contexts, but also because such use alleviates public pressure to protect children through regulation (ITU (International Telecommunication Union), Citation2020, 2020b). Parental controls are also increasingly provided by the market, possibly to fend off government regulation of their products and services and because “family-friendly” and “safety tech” services can be profitable in their own right (Billinge et al., Citation2021; Perspective Economics and DCMS (Department for Digital, Culture, Media and Sport), Citation2021). But do parental controls work? And do they meet families’ needs and expectations? It is important that the public receives impartial advice, and that policy is grounded in evidence, especially where it recommends parental controls (ITU, Citation2020, 2020b). In support of this, we review the available research on which families use parental controls, why, and with what outcomes.

A considerable body of research shows that parents generally wish to play their part in mediating their children’s digital engagement, and that although at times their efforts are driven by anxieties fuelled by panicky media headlines and a culture of parent-blaming and shaming, they benefit from awareness-raising and digital literacy initiatives (Clark, Citation2013; Livingstone & Blum-Ross, Citation2020; Vickery, Citation2017). Yet parental controls are not widely used by parents. The international survey Global Kids Online (2019) found that parents of 9- to 17-year-olds prefer enabling mediation (e.g., encouraging their child, suggesting safety strategies) and restrictive mediation (rule-based restrictions on apps or screen time) over the use of technical tools. There is some cultural variation: in Europe and South America’s wealthier countries, with a longer history of internet use and, doubtless, access to more advanced technology, parents preferred enabling mediation; by contrast, in Ghana, the Philippines and South Africa, parents favoured restrictive mediation. However, parental controls were used by less than 3% of parents in all countries that included this question in the survey (Philippines, South Africa, Albania, Montenegro and Italy). Similarly, findings from EU Kids Online show that while well over half of European parents talk to their children about their internet use, far fewer said they use parental controls: 11% of parents of 9- to 16-year-olds in Lithuania, rising to 24% in Germany and around one-third in Norway, Poland and Spain (Šmahel et al., Citation2020).

The legacy of first-generation parental controls may cast a long shadow. With names like CyberSnoop, early tools “spied” on everything the child did online, while clumsy filters blocked access to content from Essex or to sex education materials; the ethics and effectiveness of such tools have been widely questioned. Indeed, Third et al. (Citation2019) regard parental controls as emblematic of a wider “control paradigm” which, like other technologically determinist mindsets, conjures an image of children at risk, with risk itself perceived “in terms of slippery slopes [and] worst-case scenarios” (p. 44). Meanwhile, concerns are often expressed, with empirical grounds, that even if parental controls are used, children will find technical workarounds to access “forbidden fruit” while parents may be lulled into a false sense of security (Geržičáková et al., Citation2023), beguiled by businesses’ persuasive marketing claims (Lupton et al., Citation2021).

Notwithstanding considerable fear-mongering and parental anxieties about technology (Modecki et al., Citation2022), parents seem sceptical about embracing the control paradigm. Analysis of the EU Kids Online survey in Norway, Germany, Poland, Spain and Russia found that, on the one hand, between half and three-quarters of parents, depending on the country, thought parental controls would help them feel more in control and that their child would be safer online. On the other hand, between a third and two-thirds did not know how or, more importantly, whether the tools would work. Such parental scepticism may be well-founded: between 2013 and 2016, the European Commission Directorate-General for Communications Networks, Content and Technology supported regular independent evaluations of end-user filtering systems, finding them better at preventing access to pornography than to other types of content, poor in languages other than English, and with a worrying rate of over-blocking “innocent” content (SIP-Bench, Citation2018). Most do not address the full range of online risk (encompassing content, contact, conduct and contract risks of harm to users; see Livingstone & Stoilova, Citation2021). Moreover, the language remains predominantly that of control (McGinn, Citation2022), notwithstanding the importance of parent–child conversations, of ensuring children are aware of how parents use the controls, and of supporting children to learn to make their own decisions. Research also supports parents’ prioritisation of interpersonal relations: a review of parental mediation research found that what matters to children’s experience of online risk is the warmth of the child–parent relationship, and the collaborative and communicative actions this enables, more than any use of technical tools, surveillance or restrictions (Elsaesser et al., Citation2017).

Nonetheless, from the perspective of policymakers, parents can seem hard to reach and unreliable in meeting their responsibilities, jeopardising the broader ecosystem of laws, regulations, business innovation and educational initiatives that, taken together, are designed to ensure children’s wellbeing in relation to digital technologies. However, from the families’ perspective, many other factors come into play. Notably, the appropriation of technologies in everyday life involves an active process of meaning-making heavily shaped by interpersonal relationships, cultural values and imaginaries, and material circumstances (Chambers, Citation2016; Hartmann, Citation2005; Silverstone & Hirsch, Citation1992). This results not only in a diversity of approaches, but also inequalities in parental competence and resources. Consequently, governments and businesses should not anticipate a straightforward or uniform adoption of either digital technologies or the practices recommended to manage them. So what can we learn from the evidence on the use and outcomes of parental control tools that can help fulfil societal expectations for child online protection?

Methods

We conducted a systematic evidence review and assessment following the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) guidelines (Moher et al., Citation2015) to answer two research questions:

RQ1: Which families use parental controls and why?

RQ2: What are the outcomes, beneficial or otherwise, of using parental controls?

Search strategy

We searched five multidisciplinary and subject-specific databases on 29 March 2021 for empirical research studies, secondary analysis and evidence reviews concerned with children’s and/or parents’/caregivers’ experiences of parental controls published in English from 2010 to 2021 (the decade prior to the search), as part of a European Commission-funded project (euCONSENT) on technical measures for ensuring age verification and parental consent. Preliminary experimentation with search terms found few studies on parental consent (concerning children’s digital activities) but many on parental controls. It was therefore decided to broaden the search to include parental controls.

Figure 1. PRISMA flow diagram for selecting studies to be reviewed.

The final search combined four groups of terms – relating to age (e.g., age verification, age check, age-based), children (e.g., child, school student, kid), parental controls and consent (e.g., content monitoring, parental lock, Net Nanny), and the digital environment (e.g., internet, online, digital, apps, social media, Minecraft). To supplement the systematic review, we asked more than 80 subject experts across Europe and North America for their recommended sources, and conducted supplementary searches to identify relevant research reports (for details, see Smirnova et al., Citation2021).

Selection process

Of the 1,656 results identified through the database searches, 1,160 remained after de-duplication. These were screened first based on the abstract and then on reading the full text by applying five eligibility criteria concerned with the relevance and quality of the study (see Smirnova et al., Citation2021). For example, with regard to relevance, we excluded studies that did not include children or parents in their samples or that did not study the experiences of families with parental controls (e.g., empirical technical work on the development and testing of tools was excluded). In relation to study quality, we applied a criterion of methodological rigour, removing studies that, for example, lacked sufficient details for evaluation or replication. For instance, studies were excluded if the research design was unclear, the quality of research was difficult to evaluate, or the links between conceptual and methodological designs and conclusions were hard to understand. For this article, we also excluded studies that did not discuss parental control tools (i.e., those focused on age-verification methods) and those lacking direct evidence on the use or outcomes of parental controls.

This left 40 studies included in the analysis. Most of these studies were conducted in North America (n = 20) and Europe (n = 14), a few in Asia (n = 5) and fewer (n = 4) in other continents (see Note 1). They were coded according to findings relating to (1) the context of the use of parental controls (see Table 1) and (2) the reported outcomes of using parental controls (see Table 2). These codes were developed bottom-up based on identification of key themes in the study findings and grouping them into meaningful clusters. Qualitative and quantitative studies were coded similarly and given equal importance in the analysis and discussion. In all, 14 studies addressed the context of use, 30 the outcomes of use, and 4 both (see the Appendix).

Table 1. Factors related to the use of parental controls.

Table 2. Types of outcomes from the use of parental controls.

Results

Which families use parental controls and why?

Of the 40 studies in the review, 14 addressed different factors related to the context of the use of parental controls. We divided these into four themes: six studies discussed motivations for use grounded in beliefs about child development and family needs, four linked the use of parental controls with parental digital skills, seven with risk aversion and safety, and seven with parenting values and parental mediation. Multiple factors were coded for the same study where relevant (see Table 1).

The first group of studies showed that the use of parental controls reflects the needs of the child or the family. For example, surveys conducted in diverse countries found that parents of younger children are more likely to use parental control software and settings (4, 24, 33, 35, 36), as are parents with more children, possibly to limit children’s access to age-inappropriate content (11).

Parents with greater digital skills are also more likely to use parental controls. Skills may, in turn, be linked to parental age: a 2016 Pew survey of parents in the U.S.A. found that younger parents, who tend to be more technology-savvy, were more likely to use technical measures to control internet use (4). A US-based qualitative study showed that parents with more technical expertise used parental controls as a monitoring tool, while those with lower digital skills lacked knowledge of automated tools and engaged more in personal monitoring (5). Relatedly, a survey of Dutch parents of children aged 6–9 found that parental controls are used more by parents who are more confident internet users, although also by those with lower education (36). Use also depends on awareness: a UK survey of adults’ awareness of the safety measures provided by video-sharing platforms found that only 4 in 10 adult users were aware of these unprompted, and nearly as many were confused, claiming knowledge of a “dummy” measure (29). Children with lower digital skills were also less aware of filtering and monitoring tools (11).

Seven studies linked the use of parental controls to parenting values and parental mediation practices. Rather than a simple “plug-and-play” scenario, parental controls are integrated into how families negotiate technology use. Notably, parents who want to be more involved in their children’s online activities, and those who think the tools can benefit the quality of their child’s digital engagement, used control tools more (27, 28, 36). Parental controls play a role in family negotiations of screen time: a qualitative study of 10 children aged 2–6 and their parents from South Korea and the U.S.A. found the use of technical settings to be more successful when the family set them jointly, as this helped children to follow the rules (33). A Spanish and a US survey found that parents with a more authoritarian parenting style (more rules, granting less autonomy to their children) use parental controls more often (15, 23). By contrast, a qualitative US-based study showed that parental worries about privacy and autonomy were related to more “hands-off” parental mediation and avoidance of parental controls (11).

Finally, parental motivations for using parental controls are often centred on risk avoidance, as demonstrated by methodologically diverse studies (1, 6, 9, 27, 28, 31, 36). For example, nearly half of the Spanish parents of children aged 6–9 surveyed in one study said they would start using filters if their child was cyberbullied (31). This finding is echoed by parental surveys in the Netherlands and Saudi Arabia, showing that parental perceptions of risk severity and of children’s vulnerability increase the intention to use parental controls (1, 36). A qualitative study (27), a Belgian experimental co-design study (28), a content analysis of online app reviews (9) and a multimethod in-school study in Australia (6) all linked parents’ safety concerns to their desire to use parental controls to protect their child.

Outcomes of using parental controls

Since both parents and the wider society – governments, regulators and industry – place considerable hopes in the benefits that parental controls could bring to families, especially in preventing digitally facilitated harms, we examine the research on outcomes next. Do parental controls meet such safeguarding expectations? Of the 40 studies in the review, 30 reported some kind of outcome of parental control use. These outcomes can be classified thus:

  • Beneficial outcomes: studies claim a protective effect of parental controls (e.g., reduced risks such as cyberbullying) or enabling outcomes (children better controlling their digital use).

  • No change in outcomes: studies show no effect or overall inefficiency (e.g., tools are easy to bypass).

  • Limiting outcomes: studies find that parental controls reduce beneficial opportunities (e.g., less information-seeking or lower digital skills among children).

  • Adverse outcomes: studies show that outcomes can be counter-productive or harmful (e.g., increased family conflict).

These thematic groups were again identified inductively from the study findings. The results were somewhat mixed and contradictory. Seventeen studies reported beneficial outcomes, and 12 showed no change in outcomes; limiting outcomes were reported by 6 studies and adverse outcomes by 8; multiple outcomes were coded for the same study where relevant (see Table 2).

Before reviewing these findings, we caution that 20 of the 30 studies conducted surveys, thus supporting correlational but not causal analysis. While we follow the claims of study authors in labelling outcomes of the use of parental controls, it should be kept in mind that the data underpinning these claims are cross-sectional. The corpus includes one experimental study (13) and three longitudinal studies (6, 13, 25), which offer stronger support that the use of parental controls has the consequences claimed. The remaining studies use qualitative (interview-based) research methods, with some content analysis (of app reviews or online comments). While these add depth to our understanding, the claimed outcomes are inevitably shaped by parents’ or children’s self-reported perceptions.

Of the 17 studies that reported beneficial outcomes, most concerned the reduction in exposure to various types of online risks (n = 12): four studies showed that the use of parental controls was associated with lower access to pornographic or sexual content (6, 8, 17, 32; see Note 2); further studies showed a lower likelihood of cyberbullying victimisation (3), cyber-aggression (34), problematic online gaming (7) and illegal downloading (37), as well as reduced screen time (33), privacy risks (39), exposure to unhealthy weight loss material, physical harm, violent images and hate messages (35), and exposure to age-inappropriate gaming content and misleading videos or advertising (30). Some also linked the use of parental controls to positive practices such as improved time management, parent–child communication, or identification of when children need help.

Not only are the effect sizes reported by these studies generally small, but the operation of multiple other factors also qualifies any simple conclusion that parental controls bring benefits. For example, while a multimethod Australian study found that children saw much less pornographic content after the parental control app was used, the intervention also included in-school media literacy training and increased disciplinary consequences for viewing pornography on the school network (6). Similarly, a European survey found that the use of filtering tools, while associated with lower exposure to sexual content online, varied by country and showed very small effect sizes (32). Finally, a Spanish survey showed not only a very small effect of parental controls on cyberbullying victimisation, but also that this was mediated by impulsivity and high-risk internet behaviours; moreover, by comparison with parental supervision, the use of parental controls was less effective (3).

The second group of studies reported no change in outcomes, including null effects on risk reduction (12, 22, 26, 32, 35, 39). For example, a US study of adolescent girls who had experienced maltreatment found that safety-focused parental mediation reduced unintentional and intentional exposure to sexual content, offline meetings and sexual solicitations, yet the use of parental control software alone had no such effect (13). Some of the “null effect” studies documented how easily children can bypass parental controls. More positively, this group also includes studies that found that using parental controls had no negative impact on children’s online opportunities. For example, a US survey found no evidence that the use of parental controls limited children’s engagement in online activities (38). In addition, a Spanish survey found that using parental controls had no adverse effect on children’s perception of family support (23).

However, six studies did suggest that the use of parental controls limits children’s online opportunities – being linked to lower overall internet use (13), reduced privacy and autonomy for children online (14), and restricted access to beneficial online activities (8, 14), especially children’s freedom to socialise online (8, 14, 30, 39). Also noteworthy, a Russian survey found that the use of parental controls was linked to children’s lower digital competence – including less knowledge about the internet and internet safety and lower digital skills (35).

Finally, based on a mix of content analysis and qualitative methods, eight studies point to adverse outcomes. Both children and parents expressed concerns that using parental controls can increase family conflict, erode trust, reduce children’s autonomy and invade their privacy (2, 9, 11, 14, 17). In addition, interviews with 14- to 18-year-olds and their parents revealed experiences of false identification and over-blocking of “innocent” content (10). A Latvian longitudinal study found that parental controls were a risk factor for developing “compulsive internet use” one year later, which the authors interpreted as a consequence of overly strict parenting (25). The same study found that establishing rules for internet use and maintaining a positive parent–child relationship were associated with a reduced risk of “compulsive internet use.”

Discussion and conclusions

Many stand-alone and device-, network- or app-based systems of parental controls exist. They serve multiple purposes and are provided as part of a broader service or on subscription. Nevertheless, government promotion of parental controls and industry marketing are only partly supported by the independent research evidence reviewed here. We find that the minority of parents who do use parental controls do so for various reasons. However, we found no studies that matched the reasons for use with the outcome measures, and only one that compared children’s online experiences before and after using parental controls, and it reported a null effect (13); it therefore cannot be concluded that the evidence supports the claims of tool efficacy. Instead, most research relies on parental perception of, or satisfaction with, improvements following the use of parental controls, and even that shows a mixed pattern of beneficial, null and even adverse results.

Although research does not support the use of parental controls as a stand-alone or “silver bullet” solution, it does suggest that parental controls can be helpful when embedded in the mix of parental mediation practices characteristic of everyday family life (Livingstone et al., Citation2017; Nichols & Selim, Citation2022). The findings regarding the contexts of use are illuminating, revealing parents’ efforts to think about the role of parental controls in their family life in accordance with children’s needs, their interest in, and competence to engage with, their children’s digital lives, and their own need for support in navigating risk, screen time and other perceived ills. Indeed, most studies reviewed here examined the context and outcomes of using parental controls as part of a wider investigation of parental mediation. In this regard, our review mirrors the limitations of the field: we found a range of terminology in use (parental controls, parental tools, filters, blocking tools, safety measures, and so forth) but little specificity regarding different kinds of technical features. Altarturi et al. (Citation2020) propose a taxonomy that could be useful for a future examination of the merits or otherwise of different types of parental controls. They distinguish different parental control techniques (browser-based, search engines, monitoring and tracking, screen/app time controls and filtering framework), as well as different types of filtering approach (IP packet-based, URL/DNS-based, keywords-based and content-based). However, since parents themselves may be unaware of the nature of the parental controls they use, it may be difficult in future to examine which kinds of controls are more effective or more appropriate for particular problems or families, and why.

Given that the review found effect sizes for the use of parental controls to be generally small, it would be hard to place much weight on parental controls even as part of a mixed approach, especially since the outcomes include limiting children’s online opportunities and other, more negative, costs, such as undermining children’s digital skills, agency or privacy. If policymakers are to rely on the widespread and efficacious use of parental controls as part of the broader ecosystem of norms, regulations and laws governing children’s digital lives, a more robust evidence base is needed. This should include experimental testing, such as before-and-after designs or control groups, to allow for causal claims. Other evaluation methods should include matching parental expectations of parental controls with their perceived outcomes, checking not only for risk reduction but also for unintended limitations on opportunities (such as limiting access to important health and sexuality content; see Wareham, Citation2022), and controlling for parental age and digital skills.

More positively, the reviewed research points to the importance of considering the needs of both children and parents in the design, marketing and use of parental controls. While the studies that included children’s and parents’ perspectives revealed conflicting accounts of parental controls and evidence of conflicts resulting from their use, the few co-design studies conducted with parents and children resulted in the production of better tools (21, 27, 28). By contrast, parental controls that rely on privacy-invasive techniques, authoritarian rule setting or strict measures tend to be ineffective and are not viewed positively by parents or children. Granting almost exclusive power to parents and prioritising restriction over communication can result in missed opportunities for children to learn about online risks, develop coping skills and negotiate their specific needs with parents. Since not all online risks result in harm, such opportunities are essential for child development (FOSI (Family Online Safety Institute), Citation2021; Livingstone, Citation2013). In addition, measures that children see as too restrictive or invasive can lead to the erosion of trust within the family.

Hence, including children’s views in developing and applying parental controls is an excellent way to ensure the measures are as effective as possible. This crucial point is not always reflected in regulatory or business circles, but it is gaining recognition among researchers, as well as parents and children themselves. While parental controls tend to be advertised for their restrictive and controlling properties, many studies emphasise the positive role of open communication, joint rule setting, negotiation and child involvement in decision-making. For example, parental mediation may be viewed as an opportunity to co-learn (Ko et al., Citation2015) and to recognise children’s agency (Martínez et al., Citation2020). As Ghosh et al. (Citation2018, p. 8) concluded, regarding their study of app reviews, “we found that child reviews were more positive when they felt that the apps afforded them more agency (i.e., self-regulation) or improved their relationship with their parents (i.e., active mediation).”

In short, the best outcomes for children occur when parents integrate parental controls as part of positive parenting centred on open communication and respectful negotiation within the family. While further research is required to be confident of this conclusion, it accords with the child rights framework that is also gaining attention among policymakers worldwide (Hartung, Citation2020; Lievens et al., Citation2018; UN (United Nations), Citation2021). In advocating a holistic approach to children’s rights concerning the digital environment, this framework prioritises children’s voices and best interests, recognising parental responsibilities without overburdening them with problems better addressed by businesses or the government.

Research also points to positive implications for the future design and promotion of parental controls: children should be consulted during tool development and regarding their use. Tool functionality should take account of both online risks and opportunities, and evaluation of their use should be holistic. Specifically, parental controls should promote children’s agency and development, safety and privacy, and online opportunities. The marketing of such tools should not trade on parental anxieties but instead appeal to parents’ desires to integrate digital technologies within family life fairly and inclusively. Finally, policymakers should not rely unduly on parental controls to relieve businesses or the state of their responsibility to ensure children’s safety in the digital world.

Acknowledgments

We would like to thank our colleagues from the euCONSENT project, the British Library of Political and Economic Science and all the experts who advised us on the evidence review, as well as the anonymous reviewers of an earlier version of this manuscript.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

This research was funded by a grant from the European Commission: PPPA-AGEVER-01-2020 (project number LC-01622116/101018061).

Notes on contributors

Mariya Stoilova

Mariya Stoilova is a Postdoctoral Research Officer in the Department of Media and Communications, London School of Economics and Political Science (LSE). Her work lies at the intersection of child rights and digital technology, focusing in particular on online opportunities and risks, datafication and privacy, digital skills, mental health, and pathways to harm and wellbeing. Mariya’s work incorporates multimethod evidence generation and cross-national comparative analyses.

Monica Bulger

Monica Bulger is a Research Affiliate in the Creative Communities Research Group, at the University of Colorado Boulder. She studies youth and family technology use and advises policy globally. She has consulted on child online protection for UNICEF and Global Kids Online since 2012. Her research encompasses 16 countries in Asia, the Middle East, North Africa, South America, North America and Europe.

Sonia Livingstone

Sonia Livingstone, DPhil (Oxon), OBE, FBA, FBPS, FAcSS, FRSA, is a Professor of Social Psychology in the Department of Media and Communications at the London School of Economics and Political Science (LSE). She has published 20 books, including Parenting for a digital future: How hopes and fears about technology shape children’s lives. She is working on a series of European Commission- and UK Research and Innovation-funded projects concerned with children’s opportunities, risks and rights in a digital world.

Notes

1. Some studies included more than one country and were counted in each relevant area.

2. In study 32 the beneficial effect was modest, accounting for less than 0.5% of the variability observed in the EU data. No effect was found in the UK.

References

*The 40 studies included in the evidence review.

  • *Alelyani, T., Ghosh, A. K., Moralez, L., Guha, S., Wisniewski, P., & Meiselwitz, G. (2019). Examining parent versus child reviews of parental control apps on google play. 11th International Conference on Social Computing and Social Media, SCSM 2019, London, 11579, 3–21
  • *Al-Naim, A. B., & Hasan, M. M. (2018). Investigating Saudi parents’ intention to adopt technical mediation tools to regulate children’s internet usage. International Journal of Advanced Computer Science & Applications, 9(5), 456–464.
  • Altarturi, H. H. M., Saadoon, M., & Anuar, N. B. (2020). Cyber parental control: A bibliometric study. Children and Youth Services Review, 116, 105134. https://doi.org/10.1016/j.childyouth.2020.105134
  • *Álvarez-García, D., Nunez, J. C., Gonzalez-Castro, P., Rodriguez, C., & Cerezo, R. (2019). The effect of parental control on cyber-victimization in adolescence: The mediating role of impulsivity and high-risk behaviors. Frontiers in Psychology, 10(7), 1–7. https://doi.org/10.3389/fpsyg.2019.01159
  • *Anderson, M. (2016). Parents, teens and digital monitoring. Pew Research Center. http://www.pewresearch.org/wp-content/uploads/sites/9/2016/01/PI_2016-01-07_Parents-Teens-Digital-Monitoring_FINAL.pdf
  • *Badillo-Urquiola, K., Page, X., & Wisniewski, P. (2019). Risk vs. restriction: The tension between providing a sense of normalcy and keeping foster teens safe online. CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019). Glasgow, Scotland, UK.
  • *Bate, F., MacNish, J., Males, S., Chova, L. G., Torres, I. C., & Martinez, A. L. (2012). Managing student distraction: Responding to problems of gaming and pornography in a Western Australian school for boys. The University of Notre Dame Australia School of Education. https://researchonline.nd.edu.au/cgi/viewcontent.cgi?article=1053&context=edu_conference
  • *Benrazavi, R., Teimouri, M., & Griffiths, M. D. (2015). Utility of parental mediation model on youth’s problematic online gaming. International Journal of Mental Health and Addiction, 13(6), 712–727. https://doi.org/10.1007/s11469-015-9561-2
  • Bertrandias, L., Bernard, Y., & Elgaaied-Gambier, L. (2023). How using parental control software can enhance parents’ well-being: The role of product features on parental efficacy and stress. Journal of Interactive Marketing, 58(2–3), 280–300. https://doi.org/10.1177/10949968221144270
  • Billinge, G., Burgess, R., & Corby, I. (2021). EU methods for Audiovisual media services Directive (AVMSD) and General data protection regulation (GDPR) compliance. Age Verification Providers Association.
  • Bulger, M., Burton, P., O’Neill, B., & Staksrud, E. (2017). Where policy and practice collide: Comparing United States, South African and European Union approaches to protecting children online. New Media & Society, 19(5), 750–764. https://doi.org/10.1177/1461444816686325
  • Chambers, D. (2016). Changing media, homes and households: Cultures, technologies and meanings. Routledge.
  • *Chandrima, R. M., Kircaburun, K., Kabir, H., Riaz, B. K., Kuss, D. J., Griffiths, M. D., & Mamun, M. A. (2020). Adolescent problematic internet use and parental mediation: A Bangladeshi structured interview study. Addictive Behaviors Reports, 12, 100288. https://doi.org/10.1016/j.abrep.2020.100288
  • *Cino, D., Mascheroni, G., & Wartella, E. (2020). “The kids hate it, but we love it!”: Parents’ reviews of circle. Media and Communication, 8(4), 208–217. https://doi.org/10.17645/mac.v8i4.3247
  • Clark, L. S. (2013). The parent app: Understanding families in the digital age. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199899616.001.0001
  • *Cranor, L. F., Durity, A., Marsh, A., & Ur, B. (2014). Parents’ and teens’ perspectives on privacy in a technology-filled world. Symposium on Usable Privacy and Security (SOUPS), July 9–11. Menlo Park, CA.
  • Elsaesser, C., Russell, B., McCauley Ohannessian, C., & Patton, D. (2017). Parenting in a digital age: A review of parents’ role in preventing adolescent cyberbullying. Aggression and Violent Behavior, 35, 62–72. https://doi.org/10.1016/j.avb.2017.06.004
  • *Erickson, L. B., Wisniewski, P., Xu, H., Carroll, J. M., Rosson, M. B., & Perkins, D. F. (2016). The boundaries between: Parental involvement in a teen’s online world. Journal of the Association for Information Science and Technology, 67, 1384–1403. https://doi.org/10.1002/asi.23450
  • FOSI (Family Online Safety Institute). (2021). Managing the narrative: Young people’s use of online safety tools. 2021 research report. FOSI. https://global-uploads.webflow.com/5f47b99bcd1b0e76b7a78b88/618d32fb1c370900fcd08ab0_FOSI%20Research%20Report%202021.pdf
  • *Fuertes, W., Quimbiulco, K., Galárraga, F., García-Dorado, J. L. (2015a). On the development of advanced parental control tools. 1st International Conference on Software Security and Assurance (ICSSA), 1–6. Suwon, Korea (South). https://doi.org/10.1109/ICSSA.2015.011
  • *Gallego, F., Malamud, O., & Pop-Eleches, C. (2020). Parental monitoring and children’s internet use: The role of information, control, and cues. Journal of Public Economics, 188, 1–18. https://doi.org/10.1016/j.jpubeco.2020.104208
  • Geržičáková, M., Dedkova, L., & Mýlek, V. (2023). What do parents know about children’s risky online experiences? The role of parental mediation strategies. Computers in Human Behavior, 141, 1–9. https://doi.org/10.1016/j.chb.2022.107626
  • *Ghosh, A. K., Badillo-Urquiola, K., Guha, S., LaViola, J. J., & Wisniewski, P. J. (2018). Safety vs. surveillance: What children have to say about mobile apps for parental control. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18), Paper 124, 1–14. New York, USA. http://www.eecs.ucf.edu/~jjl/pubs/pn1838-ghoshA.pdf
  • *Ghosh, A. K., Badillo-Urquiola, K., Rosson, M. B., Xu, H., Carroll, J. M., & Wisniewski, P. J. (2018). A matter of control or safety? Examining parental use of technical monitoring apps on teens’ mobile devices. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18), Paper 194, 1–14. New York, USA. https://doi.org/10.1145/3173574.3173768
  • *Ghosh, A. K., Hughes, C. E., & Wisniewski, P. J. (2020). Circle of trust: A new approach to mobile online safety for families. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI 20), 1–14. New York, USA. https://doi.org/10.1145/3313831.3376747
  • *Ghosh, A. K., & Wisniewski, P. (2016). Understanding user reviews of adolescent mobile safety apps: A thematic analysis. Proceedings of the 2016 ACM (Association for Computing Machinery) International Conference on Supporting Group Work (GROUP 16), 417–420. New York, USA. https://doi.org/10.1145/2957276.2996283
  • *Hartikainen, H., Iivari, N., & Kinnula, M. (2016). Should we design for control, trust or involvement? A discourses survey about children’s online safety. Proceedings of the 15th International Conference on Interaction Design and Children (IDC 16), 367–378. New York, USA. https://doi.org/10.1145/2930674.2930680
  • Hartmann, M. (2005). The discourse of the perfect future: Young people and new technologies. In R. Silverstone (Ed.), Media, technology and everyday life in Europe (pp. 141–158). Ashgate.
  • Hartung, P. (2020). Children’s rights-by-design: A new standard for data use by tech companies. UNICEF. http://www.unicef.org/globalinsight/reports/childrens-rights-design-new-standard-data-use-tech-companies
  • *Hashish, Y., Bunt, A., & Young, J. E. (2014). Involving children in content control: A collaborative and education-oriented content filtering approach. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 14), 1797–1806. Toronto, ON. https://doi.org/10.1145/2556288.2557128
  • *Hundlani, K., Chiasson, S., & Hamind, L. (2017). No passwords needed: The iterative design of a parent–child authentication mechanism. Mobile HCI 2017, 4–7 September. https://chorus.scs.carleton.ca/wp-content/papercite-data/pdf/hundlani2017kindersurf-mobilehci.pdf
  • ITU (International Telecommunication Union). (2020). Guidelines for parents and educators on child online protection 2020. International Telecommunication Union Development Sector. http://www.itu.int/dms_pub/itu-s/opb/gen/S-GEN-COP.EDUC-2020-PDF-E.pdf
  • *Ko, M., Choi, S., Yang, S., Lee, J., & Lee, U. (2015). FamiLync: Facilitating participatory parental mediation of adolescents’ smartphone use. Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 15), 867–878. New York, USA.
  • *Law, D. M., Shapka, J. D., & Olson, B. F. (2010). To control or not to control? Parenting behaviours and adolescent online aggression. Computers in Human Behavior, 26(6), 1651–1656. https://doi.org/10.1016/j.chb.2010.06.013
  • Lievens, E., Livingstone, S., McLaughlin, S., O’Neill, B., & Verdoodt, V. (2018). Children’s rights and digital technologies. In T. Liefaard & U. Kilkelly (Eds.), International children’s rights law (pp. 487–513). Springer. http://eprints.lse.ac.uk/84871
  • Livingstone, S. (2013). Online risk, harm and vulnerability: Reflections on the evidence base for child internet safety policy. ZER: Journal of Communication Studies, 18(35), 13–28. http://eprints.lse.ac.uk/62278
  • Livingstone, S., & Blum-Ross, A. (2020). Parenting for a digital future: How hopes and fears about technology shape children’s lives. Oxford University Press.
  • Livingstone, S., Ólafsson, K., Helsper, E. J., Lupiáñez-Villanueva, F., Veltri, G. A., & Folkvord, F. (2017). Maximising opportunities and minimising risks for children online: The role of digital skills in emerging strategies of parental mediation. Journal of Communication, 67(1), 82–105. https://doi.org/10.1111/jcom.12277
  • Livingstone, S., & Stoilova, M. (2021). The 4Cs: Classifying online risk to children. Leibniz-Institut für Medienforschung. https://doi.org/10.21241/ssoar.71817
  • Lupton, D., Pink, S., & Horst, H. (2021). Living in, with and beyond the ‘smart home’. Convergence, 27(5), 1147–1154. https://doi.org/10.1177/13548565211052736
  • *Martínez, G., Casado, M. Á., & Garitaonandia, C. (2020). Online parental mediation strategies in family contexts of Spain. Comunicar, 28(65), 65–73. https://doi.org/10.3916/C65-2020-06
  • Mauk, M. (2021). Think of the parents: Parental controls in digital TV and family implications. In D. Holloway, M. Willson, K. Murcia, C. Archer, & F. Stocco (Eds.), Young children’s rights in a digital world: Play, design and practice (pp. 81–92). Springer International Publishing.
  • Mauk, M. (2022). Parenting and the algorithm: A perspective on parental controls and guilt amid digital media. In M. Ito, R. Cross, K. Dinakar, & C. Odgers (Eds.), Algorithmic rights and protections for children (pp. 35–42). MIT Publishing.
  • McGinn, M. (2022, January 26). An age-based guide to parental controls and internet safety for kids. Verizon News Center. http://www.verizon.com/about/parenting/age-based-guide-parental-controls-and-internet-safety-kids
  • *McNally, B., Kumar, P., Hordatt, C., Mauriello, M. L., Naik, S., Norooz, L., Shorter, A., Golub, E., & Druin, A. (2018). Co-designing mobile online safety applications with children. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI 18), 1–9. Montréal, QC, Canada.
  • Milosevic, T., & Livingstone, S. (2018). Protecting children online? Cyberbullying policies of social media companies. The MIT Press. https://doi.org/10.7551/mitpress/9780262037099.001.0001
  • *Miltuze, A., Sebre, S. B., & Martinsone, B. (2020). Consistent and appropriate parental restrictions mitigating against children’s compulsive internet use: A one-year longitudinal study. Technology, Knowledge & Learning, 26(4), 883–895. https://doi.org/10.1007/s10758-020-09472-4
  • Modecki, K. L., Goldberg, R. E., Wisniewski, P., & Orben, A. (2022). What is digital parenting? A systematic review of past measurement and blueprint for the future. Perspectives on Psychological Science, 17(6), 1673–1691. https://doi.org/10.1177/17456916211072458
  • Moher, D., Shamseer, L., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., Shekelle, P., Stewart, L. A., & the PRISMA-P Group. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Systematic Reviews, 4(1), 1. https://doi.org/10.1186/2046-4053-4-1
  • Nichols, S., & Selim, N. (2022). Digitally mediated parenting: A review of the literature. Societies, 12(2), 60. https://doi.org/10.3390/soc12020060
  • *Noll, J. G., Shenk, C. E., Barnes, J. E., & Haralson, K. J. (2013). Association of maltreatment with high-risk internet behaviors and offline encounters. Pediatrics, 131(2), E510–E517. https://doi.org/10.1542/peds.2012-1281
  • *Nouwen, M., JafariNaimi, N., & Zaman, B. (2017). Parental controls: Reimagining technologies for parent–child interaction. Proceedings of 15th European Conference on Computer-Supported Cooperative Work – Exploratory Papers, 2017, 18–34. European Society for Socially Embedded Technologies (EUSSET). https://dl.eusset.eu/server/api/core/bitstreams/87965281-4650-463f-bf51-de31eb7ea6fd/content
  • *Nouwen, M., Van Mechelen, M., & Zaman, B. (2015b). A value sensitive design approach to parental software for young children. Proceedings of the 14th International Conference on Interaction Design and Children (IDC 15), 363–366. New York, USA. https://doi.org/10.1145/2771839.2771917
  • *Ofcom, & Yonder. (2021). User experience of potential online harms within video sharing platforms. Ofcom. http://www.ofcom.org.uk/__data/assets/pdf_file/0024/216492/yonder-report-experience-of-potential-harms-vsps.pdf
  • O’Neill, B., Dreyer, S., & Dinh, T. (2020). The Third Better Internet for Kids Policy Map: Implementing the European Strategy for a Better Internet for Children in European Member States. https://www.betterinternetforkids.eu/bikmap
  • *Pavan Kumar Attavar, S., & Rani, P. (2018). How children under 10 years access and use digital devices at home and what parents feel about it: Insights from India. Global Media Journal: Indian Edition, 10(1), 1–25.
  • Perspective Economics and DCMS (Department for Digital, Culture, Media and Sport). (2021). The UK Safety Tech Sector: 2021 Analysis. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/989753/UK_Safety_Tech_Analysis_2021_-_Final_-_190521.pdf
  • *Pons-Salvador, G., Zubieta-Mendez, X., & Frias-Navarro, D. (2018). Internet use by children aged six to nine: Parents’ beliefs and knowledge about risk prevention. Child Indicators Research, 11(6), 1983–2000. https://doi.org/10.1007/s12187-018-9529-4
  • *Przybylski, A., & Nash, V. (2018). Internet filtering and adolescent exposure to online sexual material. Cyberpsychology, Behavior, and Social Networking, 21(7), 405–410. https://doi.org/10.1089/cyber.2017.0466
  • *Seo, H., & Lee, C. S. (2017). Emotion matters: What happens between young children and parents in a touchscreen world. International Journal of Communication, 11, 561–580. https://ijoc.org/index.php/ijoc/article/viewFile/4233/1919
  • *Shapka, J. D., & Law, D. M. (2013). Does one size fit all? Ethnic differences in parenting behaviors and motivations for adolescent engagement in cyberbullying. Journal of Youth and Adolescence, 42(5), 723–738. https://doi.org/10.1007/s10964-013-9928-2
  • Silverstone, R., & Hirsch, E. (1992). Consuming technologies: Media and information in domestic spaces. Routledge.
  • SIP-Bench, III. (2018). Benchmarking of parental control tools for the online protection of children – final report. Publications Office. https://doi.org/10.2759/80227
  • Šmahel, D., Machackova, H., Mascheroni, G., Dedkova, L., Staksrud, E., Ólafsson, K., Livingstone, S., & Hasebrink, U. (2020). EU Kids online 2020: Survey results from 19 countries. EU Kids Online. https://eprints.lse.ac.uk/103294
  • Smirnova, S., Livingstone, S., & Stoilova, M. (2021). Understanding of user needs and problems: A rapid evidence review of age assurance and parental controls. euConsent. http://eprints.lse.ac.uk/112559
  • Smirnova, S., Stoilova, M., & Livingstone, S. (2021). Age assurance and parental control tools in everyday life: A rapid evidence review methodology. London School of Economics and Political Science. https://osf.io/mdgk8
  • *Soldatova, G. U., Rasskazova, E. I., & Chigarkova, S. V. (2020). Digital socialization of adolescents in the Russian Federation: Parental mediation, online risks, and digital competence. Psychology in Russia: State of the Art, 13(4), 191–206. https://doi.org/10.11621/pir.2020.0413
  • *Sonck, N., Nikken, P., & de Haan, J. (2013). Determinants of internet mediation: A comparison of the reports by Dutch parents and children. Journal of Children and Media, 7(1), 96–113. https://doi.org/10.1080/17482798.2012.739806
  • Staksrud, E. (2016). Children in the online world: Risk, regulation, rights. Routledge.
  • Third, A., Collin, P., Walsh, L., & Black, R. (2019). Control shift: Young people in digital society. Palgrave Macmillan UK. https://doi.org/10.1057/978-1-137-57369-8
  • *Tomczyk, L., Ryk, A., & Prokop, J. (2018). Digital piracy among adolescents – scale and conditions. In New trends and research challenges in pedagogy and andragogy (NTRCPA18). Charles University in Prague.
  • UNICEF. (2021). Child protection: Digital opportunities, challenges and innovations across the region. UNICEF Europe and Central Asia Regional Office. www.unicef.org/eca/media/14386/file
  • UNICEF and ITU (International Telecommunication Union). (2020). Guidelines for Industry on Child Online Protection 2020. http://www.unicef.org/media/90796/file/ITU-COP-guidelines%20for%20industry-2020.pdf
  • UN (United Nations) Committee on the Rights of the Child. (2021). General comment 25 on children’s rights in relation to the digital environment (CRC/C/GC/25). www.ohchr.org/EN/HRBodies/CRC/Pages/GCChildrensRightsRelationDigitalEnvironment.aspx
  • *Vaala, S. E., & Bleakley, A. (2015). Monitoring, mediating, and modeling: Parental influence on adolescent computer and internet use in the United States. Journal of Children and Media, 9(1), 40–57. https://doi.org/10.1080/17482798.2015.997103
  • Vickery, J. R. (2017). Worried about the wrong things: Youth, risk, and opportunity in the digital world. The MIT Press.
  • Wareham, J. (2022, January 19). 92% of top parental control apps wrongly block LGBTQ and sex-ed sites. Forbes. http://www.forbes.com/sites/jamiewareham/2022/01/19/92-of-top-parental-control-apps-wrongly-block-lgbtq-and-sex-ed-sites/?sh=67e8f1307844
  • *Wisniewski, P., Jia, H., Xu, H., Rosson, M. B., & Carroll, J. (2015). ‘Preventative’ vs. ‘reactive’: How parental mediation influences teens’ social media privacy behaviors. Proceedings of the 18th ACM (Association for Computing Machinery) Conference on Computer Supported Cooperative Work & Social Computing (CSCW 15), 302–316. New York, USA.
  • *Wisniewski, P. J., Xu, H., Rosson, M. B., & Carroll, J. M. (2014). Adolescent online safety: The ‘moral’ of the story. Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW 14), 1258–1271. Baltimore, MD, USA.