
Systematic review: Characteristics and outcomes of in-school digital media literacy interventions, 2010-2021

Keren Eyal & Tali Te'eni-Harari
Pages 8-28 | Received 10 Aug 2022, Accepted 27 Sep 2023, Published online: 30 Oct 2023

ABSTRACT

This systematic review examines characteristics and outcomes of interventions for teaching digital media literacy in the educational system. Despite the development of media technology and the importance ascribed to digital media literacy as one of the critical skills for the 21st century, this study reveals that little research has been carried out evaluating interventions for teaching digital media literacy in schools. The successful intervention outcomes identified in this review include an increased understanding of media content and greater awareness of media influence, a more critical approach to media, increases in feelings of competency and empowerment with regard to media use, increases in digital media content production skills, and reduction in excessive or risky media use. The review finds that more consistent positive outcomes were associated with younger target audiences, the incorporation of a practical component within the intervention, and longer intervention duration or a higher number of sessions. The studies reviewed identified some challenges to achieving successful outcomes in such interventions, namely that media technologies are an intrinsic part of children’s everyday lives today.

Impact Summary

Prior State of Knowledge

Digital media literacy – the critical handling of media, from traditional forms to newer digital media – is an essential skill for the 21st century. Little prior research has summarized knowledge about the outcomes of digital media literacy interventions.

Novel Contributions

This systematic review summarizes recent academic knowledge about in-school digital media literacy interventions. It identifies three important intervention characteristics associated with diverse intervention outcomes, including understanding of, a critical approach toward, and creation of media content.

Practical Implications

The study highlights characteristics of digital media literacy interventions to which practitioners – educators and academics creating and implementing such interventions – should pay careful attention to enhance the success of the interventions in achieving their goals.

Recent years have seen a significant increase in children’s and adolescents’ media use worldwide (Bozzola et al., 2022; Ofcom, 2022; Rideout et al., 2022). Many studies point to links between media use and a wide range of both positive and negative outcomes among young audiences (e.g., Behnamnia et al., 2020; Vannucci et al., 2020). One of the proposed methods for dealing with the challenges that media pose to young audiences – and for enhancing the positive outcomes of media use – is to foster media literacy among media consumers. In light of the recent rise in digital media use, and of the importance of digital media literacy interventions in providing young audiences with tools to effectively handle this environment, the current study reviews research on in-school digital media literacy interventions to identify their characteristics and outcomes.

The definition of media literacy suffers from a lack of consensus among scholars of this topic (Turin & Friesem, 2020). Potter and Thai (2019) found that a fourth of media literacy studies did not even include a definition of this term and the remaining three-fourths covered a wide variety of definitions, with about half providing their own definition, not linked to past research. But there does seem to be a broad base of agreement that media literacy “centers on specific knowledge and skills that can help critical understanding and usage of the media” (Jeong et al., 2012, p. 455). Indeed, critical thinking seems to be the core essence of media literacy about which most scholars agree (Xie et al., 2019). The concept describes the skills and abilities of audience members to function as active and critically aware media consumers who are cognizant of the target audience of the message, the possible interests behind the message’s creation, and its potential effects on the audience (Vahedi et al., 2018).

Some scholars (e.g., Potter, 2013) consider media literacy a protective factor, shielding consumers from the dangers of the media. Others (e.g., Hobbs, 2011) view the audience as an active entity and focus on empowering the audience through media literacy. Regardless of their approach, most scholars consider the teaching of media literacy a vital process for improving the media audience’s quality of life (Turin & Friesem, 2020).

Digital media literacy refers to the critical handling of media, from traditional forms through newer digital media including computers, the Internet, social networks, and digital games. Martin and Grudziecki (2006) consider digital literacy to include several levels: the ability to think about and understand media and media content, the ability to implement knowledge about media use in practice, and creativity in media use. Digital media literacy includes cognitive aspects of knowledge and understanding, attitudes toward the media, and behaviors manifested through the application of knowledge and through the production of media content. Digital media literacy is among the 21st century skills included in the programs of educational institutions and organizations around the world (Israeli Ministry of Education, n.d.). Studies have shown that digital skills are positively related to online opportunities and technological orientation (Livingstone et al., 2021).

Digital media literacy interventions

Scholars agree that media literacy is not an inborn ability, but rather needs to be developed through learning and experience, primarily by putting into practice the individual’s knowledge and abilities (Livingstone et al., 2021; Potter, 2013). Digital media literacy interventions, as relevant to the present study, are “special treatments designed by researchers to increase some aspect of media literacy … among targets” (Potter, 2013, p. 425) that focus on digital media technologies and content. Based on existing definitions (e.g., Martin & Grudziecki, 2006; Walther et al., 2014), these are efforts to teach audiences critical evaluation of media and media content on digital platforms, including knowledge and awareness of media production and potential effects, digital media content production skills, enhancing feelings of competency and empowerment in dealing with digital media, and the reduction of risky behaviors associated with digital media. Interventions for teaching digital media literacy can be held both in informal and unstructured settings (e.g., at home in a family setting) and in formal settings (e.g., at school). In schools, interventions can be taught through regular courses on the topic, classes about the media in general, classes on subjects unrelated to the media (e.g., art, language) which include components related to literacy, or focused one-time or multiple-session interventions.

Past systematic reviews and meta-analyses detail a host of variables associated with media literacy interventions which are likely to play a role in determining the intervention’s outcomes. The research is often inconsistent with regard to the role of these variables in the intervention’s success. For example, whereas Jeong et al. (2012) reported that interventions with more sessions were more likely to be successful, Johnson et al. (2010) reported greater success for shorter interventions. Similarly, past research has not been consistent in findings about the role of message slant in intervention success, with some reporting that positive messages had a positive persuasive effect and others reporting the opposite (Jacobson et al., 2019; Teng et al., 2019). Intervention effect sizes did not vary by the agent administering the intervention (Jeong et al., 2012). And, perhaps surprisingly, the age and gender of intervention participants have not been consistently found to be associated with intervention effects (Jeong et al., 2012; Vahedi et al., 2018; Xie et al., 2019). Finally, the extent of involvement of parents or other community agents in the intervention has been suggested to be an important construct to examine with regard to the outcomes of media literacy interventions. Parents’ attitudes, monitoring, and relationship with their children have been linked with children’s online use and online privacy strategies (Davis & James, 2013; Livingstone et al., 2019). However, not enough is known about how these constructs relate to the outcomes of digital media literacy interventions, especially when these are administered in the school context.

Importantly, of the four meta-analyses mentioned above, two examined the effects of interventions on health outcomes but not on media literacy outcomes (i.e., Johnson et al., 2010; Xie et al., 2019). Two of the meta-analyses examined the effects of interventions on media literacy outcomes (i.e., Jeong et al. [2012] found a moderate effect for interventions among 51 studies; Vahedi et al. [2018] found a small to medium effect size of interventions among 15 studies). We extend these two meta-analyses, as Jeong et al. analyzed studies published until 2009 and Vahedi et al. analyzed studies published until 2016. The current study examines interventions reported in articles published between 2010 and 2021. Also, the studies in the current systematic review focus specifically on in-school digital media literacy interventions.

The present study is a systematic review of research conducted in recent years on the topic of digital media literacy interventions in the formal educational system. It examines the reported characteristics and outcomes of such interventions. Potter (2013) posited that the academic synthesis of all this research would be helpful to “start to build sets of practical guidelines that will be highly useful to instructors” (p. 430). The systematic review will describe existing knowledge in the field as well as which issues are unresolved and which discrepancies exist in the literature. The systematic review aims to generate knowledge-based conclusions regarding the outcomes of such interventions, by considering the specific characteristics of the interventions that are most likely to be associated with positive outcomes for participants.

Research questions

The systematic review asks what outcomes are associated with different characteristics of interventions for teaching digital media literacy in the educational system. Within this broad question, the review addresses questions about the context of such interventions, focusing on variables identified in past research, as reviewed above (e.g., Jeong et al., 2012; Vahedi et al., 2018; Xie et al., 2019):

RQ1: To what extent are interventions successful in achieving different outcomes (e.g., knowledge and awareness of digital media production and media effects, a critical approach toward the media, feelings of competency and empowerment in digital media use, production skills of digital media content, reducing risky digital media behaviors)?

RQ2: What are the outcomes of interventions associated with different participants’ characteristics: (i) age and (ii) gender?

RQ3: What are the outcomes of interventions with different contexts: (i) interventions that are only theoretical versus interventions that include a media content-creation component and (ii) interventions that include only positive or only negative messages about digital media versus interventions that include both positive and negative messages about digital media?

RQ4: What are the outcomes of interventions administered across different durations and number of sessions?

Method

A systematic review is a methodology that locates and synthesizes empirical knowledge from existing studies following pre-specified criteria for the purpose of answering specific research questions and drawing clear and reliable conclusions about the state of knowledge in a particular field (Denyer & Tranfield, 2009; Liberati et al., 2009). Systematic reviews have become a more common methodology in recent years, including in the digital technology field (e.g., Navarra et al., 2021; See et al., 2020) and in the intervention field (Shea et al., 2017). As a result, decision makers across diverse fields rely on systematic reviews as a key tool for achieving evidence-based outcomes (Shea et al., 2017). As is customary in systematic reviews (Denyer & Tranfield, 2009) and similar to other recent systematic reviews in the media field (e.g., Navarra et al., 2021), this study systematically reviewed research on interventions for teaching digital media literacy in the educational system, based on clear, transparent criteria for inclusion of publications in the review.

Article selection

This study reviewed publications (academic articles, doctoral dissertations, and theses) dealing with interventions for teaching digital media literacy in the educational system that were published between 2010 and the end of 2021. The study reviewed empirical articles, published in Hebrew or English in Israel and elsewhere, that documented research on digital interventions within the school system (at the preschool, elementary, middle school, and high school levels) to examine the characteristics and outcomes of interventions for teaching digital media literacy. The study included both qualitative and quantitative articles, similar to Bellon-Harn et al. (2020) and Tandon et al. (2021). Not included in this review were articles that did not discuss an intervention (e.g., theoretical articles, literature reviews, or articles containing only recommendations), studies that did not deal with school-aged children or that were not conducted in a school setting, or research on using the media for other purposes (e.g., studying literature or math, encouraging public participation) rather than specifically teaching media literacy.

The academic search engines ProQuest, EbscoHost, and Google Scholar, which contain multiple academic databases, were used to search for articles (see Appendix A). The search was conducted in November-December 2021 and made use of the following search words: Adolescents/Children, Digital Media, Digital Literacy, and Classroom. In addition, a complementary search was conducted using the names of several key scholars in the field, including Renee Hobbs, Sonia Livingstone, W. James Potter, and Erica Weintraub Austin. In addition to using databases, we contacted prominent Israeli researchers in the field and, through them, reached out to additional scholars in order to locate additional publications. Figure 1 describes the full search strategy, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA; Liberati et al., 2009). The articles analyzed in the systematic review are noted by an asterisk in the reference list.

Figure 1. Systematic review search strategy, following PRISMA guidelines.

Article quality assessment and coding

Appendix C details the process of quality assessment of the articles included in the systematic review. The articles were globally assessed by considering four criteria: (a) Clarity of study rationale, including the presentation of a theoretical framework and the importance of digital media literacy interventions, (b) Detailed presentation of the intervention, including a comprehensive introduction of the intervention procedure and materials, (c) Detailed presentation of appropriate methodology to assess the intervention outcomes, including a clear description of the sample, procedure, and measures used, and the extent to which the method and measures are anchored in past research and capable of addressing the research questions, and (d) Detailed reporting of study findings and conclusion, including the presentation of supporting evidence that is clearly linked to the method used.

In coding the articles, both authors examined each article independently and simultaneously. First, the authors examined the abstracts of all articles identified in the search and second, after the final sample for analysis was determined, each author reviewed the articles to analyze the examined variables. Data were recorded in a pre-established table. The authors worked according to a clear but flexible protocol, based on the guidelines suggested by Denyer and Tranfield (2009). In the next stage, the authors discussed the findings to reach consistent conclusions for inclusion in the overall review, in a process similar to that described by Bellon-Harn et al. (2020), Kurz et al. (2022), and See et al. (2020).

The variables considered in coding the material included information about the article (e.g., the type of publication, country); general intervention characteristics (e.g., platform/medium of intervention, content of focus in intervention, central theory guiding the intervention, methodology used for assessing intervention outcomes); possible intervention outcomes and their direction (i.e., understanding of media production and content and awareness of potential media influence, critical approach to media, feelings of empowerment/efficacy in digital media use, digital media production skills, risky media-related behaviors); participants’ characteristics (i.e., age and gender); intervention context (i.e., theoretical-only versus interventions that include a content-creation component; positive, negative, or mixed-valence intervention messages about the media); and intervention logistics (e.g., number of intervention sessions, intervention duration) (see Appendix B for the complete coding protocol).

Results

Table 1 summarizes all of the findings from the articles included in the systematic review and serves as the basis for the results reported in this section.

Table 1. Summary of findings from articles in the systematic review.

General article and intervention information

When considering the quality assessment criteria applied to the articles, 15 of the 17 articles met the criteria for acceptable and even high quality despite some articles including small samples, a lack of a clear theoretical framework, or missing methodological details. Only with regard to two articles in the review is the certainty of evidence lower due to a poor presentation of intervention or methodological detail and less substantiated conclusions based on the presented supportive evidence. These two articles were maintained in the review because they are clearly focused on the review’s topic, digital media literacy in the classroom.

Eleven of the interventions were conducted in North America, most in the United States and the others in Canada. The remaining studies were each conducted in a different country. Most interventions dealt with news, advertisements, or narratives in the media. The media technologies chiefly studied or used to support the interventions’ messages were the Internet (including blogs and virtual communities), movies, and computers (including games and software such as Scratch).

Six of the articles did not mention any theoretical framework for the interventions, whereas almost every one of the other studies was based on a different model from the field of communication research (e.g., the message interpretation process model, storytelling). Regarding the research methods used to assess the interventions, nine of the articles were qualitative (largely interviews, some observations, and focus groups), four were quantitative (e.g., experiments), two combined quantitative and qualitative methods, and two did not specify their research method for assessing the intervention’s outcomes. The size of the intervention audience varied. Three articles focused on fewer than 10 students. Eight of the interventions (in eight articles) included 20–88 participants, and three interventions were administered to hundreds of participants (n = 728–1,845). Three articles did not indicate the size of their intervention sample, although each seemed to be approximately the size of a single class.

Answering the research questions

RQ1: Different intervention outcomes

The most prevalent outcome examined as resulting from the interventions was the creation of media content (in 10 articles). The content created across the interventions was diverse and included digital ads, blog entries, news stories, video clips, and digital games. All studies assessing this outcome were qualitative in nature. All studies assessing content creation reported positive outcomes, including the successful creation of content, the perceived improvement in production skills and, largely, the enjoyment of the creation process. Participants’ enjoyment of the media content creation process was often attributed to the collaboration it entailed with other students in the project and also with people outside of it (e.g., family members). The sharing of the created content sometimes took place within the intervention (that is, some intervention time was devoted to sharing the creation and discussing it) and sometimes through online platforms.

Also in 10 articles, a reported outcome was a critical approach toward media. Three of these studies employed a quantitative experimental methodology. For example, Babad et al. (2012) reported on an intervention that taught students about a specific media bias; in turn, the study reported a successful decrease in the influence of the media bias on the assessment of media content, indicating a more critical approach of audience members to the content. In contrast, Pinkleton et al.’s (2012, 2013) experimental studies did not find a significant decrease in the desirability of sexual advertising following their intervention. The qualitative studies reported that the interventions were linked with a more questioning, doubting stance toward the topics addressed, including cyber-citizenship, media representation, cyber-security, etc. Only Begoray et al. (2013) reported partial success: 7th graders who participated in their intervention about advertising positioning took a critical stance toward ads only part of the time; some students remained more accepting of the ads’ intended positioning even after the intervention.

The third most prevalent outcome reported was knowledge and understanding of media (in 9 articles). Two of the articles employed a quantitative methodology and another employed a mixed-method approach, combining a survey and interviews. For example, Pinkleton et al. (2012, 2013), who employed an experimental design, found that participation in their intervention led to a greater understanding of sexual myths presented in the media and of the media’s effects on adolescent sexual decision-making relative to the control (no intervention) group. The remaining six articles were qualitative in nature. All these articles reported a perceived increase in knowledge and awareness of media content, biases, and effects on audiences in realms such as cyber-bullying, journalism, and stereotypes. Maqsood and Chiasson (2021) reported that their findings of improved knowledge of cyber-security also held in a one-week follow-up assessment.

The fourth most prevalent outcome, accomplishing a sense of empowerment, was reported in four studies, all qualitative in nature. These articles described feelings shared by participants that the interventions made them feel more confident as consumers and creators of media content, as well as a perceived sense of heightened self-esteem with regard to the topic of the intervention. For example, following an intervention intended to teach digital storytelling skills to Bhutanese students, Gyabak and Godina (2011) reported that the students developed a “sense of voice” (p. 2242) because they were able to tell their individual stories in a new way, using digital media technology.

Finally, two articles reported outcomes associated with risky media behaviors and behavioral intentions. Maqsood and Chiasson (2021) qualitatively assessed the outcomes of an intervention involving a digital educational game about cyber-security and privacy. They reported that 11- to 13-year-old intervention participants exhibited improvement in online security behavioral intentions. Walther et al. (2014) quantitatively assessed the effects of an intervention focusing on computer gaming and Internet use. They reported that the intervention (versus control) group had fewer participants showing addictive media use patterns over time, though the intervention was not successful in impacting the frequency and duration of Internet use.

RQ2: Intervention outcomes according to participants’ characteristics

Age

Of the 17 articles, only one examined preschool children, 10 examined elementary school children, 8 examined middle school children, and 3 examined high school students. Whereas interventions with preschool and elementary school children reported successful outcomes in diverse realms, interventions with middle and high school students reported more mixed findings (see Table 1).

Gender

Only two studies tested gender differences in the effects of an intervention. Pinkleton et al. (2013) found no gender differences in intervention outcomes, whereas Pinkleton et al. (2012) reported that boys found advertising messages more desirable (a marginal effect) and were less likely than girls to understand that the media portray sexual myths. In both articles the authors noted that both genders overall seemed to benefit from the digital media literacy intervention about sexual media content.

RQ3: Intervention outcomes with different intervention contexts

Eight of the interventions based their entire lesson plans on content creation; teaching in these interventions focused only on media aspects associated with the production process (e.g., familiarizing the participants with the iPad/computer used in the creation process, critiquing the product created). Content creation activities included inventing characters and storylines as part of maker-literacies in filmmaking (Davis et al., 2021) and creating digital games (Sousa et al., 2018). Four interventions were theoretical only; that is, participants did not engage in creation of original media content, though they may have practiced the application of concepts learned in the intervention in analyzing media content examples. Five interventions combined theoretical lessons with creation of media content by participants.

The content creation-focused interventions were solely assessed by qualitative methods and all reported a successful achievement of intervention goals across diverse outcomes. First and foremost, all content creation activities were reported to be successful and many reported enjoyment of the participants in the creation process, also due to its collaborative nature. Additional outcomes reported included increased knowledge of media, adopting a more critical approach toward media, and an increased sense of empowerment. The theoretical-only interventions were assessed by three quantitative studies and one qualitative study; they reported largely positive outcomes on knowledge and adopting a critical approach to the media as well as success in reducing risky media behavior and behavioral intentions, which were reported to persist over time. In Walther et al.’s (2014) theoretical-only intervention, however, Internet use frequency and duration were not reduced following the intervention. The combined theory-creation interventions were assessed by both qualitative and quantitative methods and reported partial success in achieving intervention goals. Improvements were reported on knowledge about the media, empowerment, and content creation. The adoption of a more critical stance toward the media was only partially achieved in these interventions.

Only six articles reported on interventions that included negative messages about the media, that is, interventions that referred to problematic media messages or the negative effects of the media on audiences (e.g., addressing risky media effects on youth, the importance of protecting one’s privacy, etc.); these interventions reported largely positive outcomes on understanding of media content, adopting a critical approach toward media, and potentially reducing risky media behaviors. One such intervention, however, did not succeed in impacting the desirability of sexual advertising. The remaining 11 interventions did not seem to address problems associated with the media in their lessons; rather, they seem to have focused on content creation or on learning positive topics about the media (e.g., playing video games, becoming familiar with the computer). These positively-slanted interventions largely reported positive outcomes.

RQ4: Intervention outcomes with different logistics

There were three main categories of intervention logistics: (a) six interventions were short in duration or number of sessions, including one intervention with a single session and the others ranging from two to five sessions; (b) five interventions included multiple sessions (approximately 10–18 sessions) or lasted over a period of several months; and (c) six interventions were classified as year-long projects, mostly pedagogical or academic courses (although in some of these, only 5 or 8 sessions were conducted throughout the year).

Of the focused interventions (those short in duration, number of sessions, or both), three reported largely positive outcomes and three reported partial success in achieving outcomes. Of the mid-sized interventions (multiple sessions spread over at least a few months), four reported positive outcomes and only one reported partial success. The year-long interventions all reported positive outcomes.

Discussion

In the academic literature there is extensive theoretical discussion about the conceptualization of digital media literacy. The literature also contains a wealth of ideas about models and recommendations for digital media literacy interventions and training schemes for teachers. However, little research has been carried out with actual interventions for teaching digital media literacy in schools and little work has been done to understand what interventions might succeed in increasing audiences’ media literacy. The goal of the present study was to fill this gap in the academic literature. Despite the tremendous development of digital media technology and the importance ascribed to digital media literacy as one of the critical skills for the 21st century, this review reveals a small number of studies that explored the outcomes of digital media literacy interventions in the educational system in the past decade.

Even within this small set of studies, it becomes clear that in-school digital media literacy interventions are associated with many positive outcomes for participants. The successful outcomes found in this review refer to an increased understanding of media content, greater awareness of media effects, a more critical approach to media, impact on the sense of competency and empowerment in the context of media use, reduction (or intended reduction) in excessive or risky media use, and successful media content creation. It is interesting to note that, though critical assessment is considered the core of digital media literacy, only ten of the articles in the current review specifically assessed this outcome for their interventions. Seven articles assessed other related outcomes, including content creation and understanding of media. Alongside success in achieving these outcomes, some articles reported partial success in reaching these outcomes, which varied according to different dimensions associated with the interventions and their administration, as will now be detailed.

Most interventions were administered in elementary schools and all reported positive outcomes. Fewer interventions were administered to preschool, middle-, and high-school children and these reported positive outcomes but also partial success in achieving desired outcomes. However, caution should be applied when linking younger age with greater success of digital media literacy interventions as more research with children other than in elementary school is needed. Similarly, as only two studies evaluated gender differences in intervention outcomes, conclusions about gender await further investigation.

All interventions that focused on content creation with no theoretical components in their lessons were reported as successful. This is not to say that teaching theoretical concepts or ideas is not recommended, as most of the theoretical-only interventions (and the interventions that combined theory with content creation) were also successful in achieving media literacy goals. Rather, the conclusion seems to be that including a content creation component in in-school digital media literacy interventions is important, perhaps because it enhances the active involvement of participants in the intervention and, as a result, their enjoyment of the intervention, which supports learning and protects from negative media influences (Potter, 2013). Another explanation for the importance of content creation activities in in-school media literacy interventions may be that they allow opportunities for collaboration between students in the classroom. Working together increases enjoyment and allows new learning possibilities (for instance, via mutual feedback on the content created; e.g., Melander Bowden & Aarsand, 2020).

In terms of the positive versus negative slant of the intervention messages with regard to the media, it seems from this review that both types of interventions were associated with success and partial success in goal achievement. Scholars have been debating the role of positive versus negative messages about the media in campaigns and interventions in impacting audience attitudes and behaviors (e.g., Jacobson et al., 2019; Teng et al., 2019). The idea of message slant fits within the existing debate in the media literacy scholarship between a protectionist/persuasion versus an empowering approach (Hobbs, 2011; Potter, 2013; Potter & Thai, 2019). But more research is needed to draw clear conclusions about this aspect of the interventions as most articles did not provide thorough details about their specific messages across lessons.

The review suggests that the duration and number of sessions of the intervention are important for achieving positive outcomes. Longer interventions – in number of sessions and/or length of time of administration – seem to be associated with more consistent positive outcomes of all types. Shorter interventions – either in duration or number of sessions – reported partial success in outcome achievement. The current finding is not entirely consistent with past research. For example, Johnson et al. (2010), in their meta-analysis of health-related behavioral interventions, reported that shorter duration interventions were more successful in impacting desired outcomes than longer interventions. Other intervention research has reported complex findings regarding the effects of intervention duration, with some outcomes being positively impacted by duration length and others not differentially impacted by interventions of different durations (e.g., Watson & Vaughn, 2006). It is important to note that the current study focused on media-related outcomes of digital media literacy interventions that took place within the school system, whereas other studies and meta-analyses had a different focus in terms of outcomes and intervention contexts, which may account for the differential findings.

To summarize, three important variables emerged in this review as meaningfully linked with the reported success of digital media literacy interventions in achieving their intended outcomes: the age of intervention participants, the duration/number of sessions of the intervention, and the inclusion of a practical component in the intervention lessons. More consistent positive outcomes were reported among the younger age groups relative to middle- and high-school students. The partial success among older children may be attributable to the fact that the media are an intrinsic part of children’s lives today and media content is very attractive – two major obstacles to overcome in interventions. If this is the case, reaching children at a younger age with messages about digital media literacy is likely to be more successful, as they have yet to develop entrenched media use habits and perceptions. It could also be that reaching positive outcomes among an older audience of adolescents is possible when the intervention begins at an earlier age, creating the building blocks necessary for digital media literacy in the long term. This latter conclusion is consistent with the second meaningful variable identified in the current review – the finding that longer duration and more intervention sessions are associated with more positive outcomes from the intervention. Third, interventions seem to achieve positive outcomes when they combine theoretical lessons with content-creation activities as part of their intervention sessions. Associated with the content production component of interventions are collaboration between peers and game-like characteristics, which may garner greater involvement and enjoyment on the part of the intervention participants.

Limitations

The systematic review presented here examined studies of in-school digital media literacy interventions over a period of a decade (2010–2021). Despite an extensive search using academic search engines and outreach to leading academics, the final scope of this review consisted of only 17 intervention studies that met all the predefined criteria. Additional digital media literacy intervention studies have probably been conducted but were not documented in the academic literature and therefore could not be included in this review. Furthermore, the studies featured in the systematic review used different research methods, examined different age groups, and discussed different variables and measurements of digital media literacy skills. The differences between the studies, together with a lack of consensus regarding measures for evaluating interventions, make comparisons across the studies difficult (Hobbs, 2011). However, the fact that different research methods were examined, both qualitative and quantitative (similar to Bellon-Harn et al. [2020] and Tandon et al. [2021]), enabled us to understand different aspects of the topic and gain a broad perspective about the intervention studies.

Future research directions

The literature on in-school digital media literacy interventions would benefit from using a coherent theoretical framework in assessing intervention outcomes. Though there are many theories in the media literacy literature, including some that have guided interventions (e.g., the message interpretation process model, Vahedi et al., 2018), the current systematic review found that in-school digital media literacy interventions reported in approximately the last decade hardly used the same theory twice to guide their interventions.

Moreover, future research would benefit from examining certain variables more rigorously to ascertain their role in different intervention outcomes, including the participants’ gender, the involvement of parents or other socialization agents in the intervention, and cultural context. Cultural context appeared to be relevant in some studies that emphasized the attitudes of the local culture toward digital media technologies and the living conditions of the intervention participants (e.g., in Bhutan, Gyabak & Godina, 2011, and in Israel, Gozansky, 2021) but was hardly elaborated upon in the articles.

Acknowledgments

This work was supported by a grant from the Israeli Ministry of Education.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

The work was supported by the Ministry of Education, Israel.

Notes on contributors

Keren Eyal

Keren Eyal (Ph.D., University of California, Santa Barbara, 2004) is Senior Lecturer in the Sammy Ofer School of Communications at Reichman University. Her research focuses on media content and effects, particularly the media’s role in youth socialization. E-mail: [email protected]

Tali Te'eni-Harari

Tali Te’eni-Harari (Ph.D., Bar Ilan University, 2005) is Senior Lecturer in the Business School at Peres Academic Center. Her research focuses on understanding media, advertising, and young people. E-mail: [email protected]

References

  • * Babad, E., Peer, E., & Hobbs, R. (2012b). Media literacy and media bias: Are media literacy students less susceptible to nonverbal judgment biases? Psychology of Popular Media Culture, 1(2), 97–107. https://doi.org/10.1037/a0028181
  • Balshem, H., Helfand, M., Schunemann, H. J., Oxman, A. D., Kunz, R., Brozek, J., Vist, G. E., Falck-Ytter, Y., Meerpohl, J., Norris, S., & Guyatt, G. H. (2011). GRADE guidelines: 3. Rating the quality of evidence. Journal of Clinical Epidemiology, 64(4), 401–406. https://doi.org/10.1016/j.jclinepi.2010.07.015
  • * Begoray, D., Higgins, J. W., Harrison, J., & Collins‐Emery, A. (2013a). Adolescent reading/viewing of advertisements: Understandings from transactional and positioning theory. Journal of Adolescent & Adult Literacy, 57(2), 121–130. https://doi.org/10.1002/JAAL.202
  • Behnamnia, N., Kamsin, A., Ismail, M. A. B., & Hayati, A. (2020). The effective components of creativity in digital game-based learning among young children: A case study. Children & Youth Services Review, 116, 105227. https://doi.org/10.1016/j.childyouth.2020.105227
  • Bellon-Harn, M. L., Morris, L. R., Manchaiah, V., & Harn, W. E. (2020). Use of videos and digital media in parent-implemented interventions for parents of children with primary speech sound and/or language disorders: A scoping review. Journal of Child & Family Studies, 29(12), 3596–3608. https://doi.org/10.1007/s10826-020-01842-x
  • Bozzola, E., Spina, G., Agostiniani, R., Barni, S., Russo, R., Scarpato, E., DiMauro, A., DiStefano, A. V., Caruso, C., Corsello, G., & Staiano, A. (2022). The use of social media in children and adolescents: Scoping review on the potential risks. International Journal of Environmental Research and Public Health, 19(16), 9960. https://doi.org/10.3390/ijerph19169960
  • * Cortez-Riggio, K. (2014a). Digital media engagement and the moral/ethical thinking of fifth grade bloggers. Available from ProQuest Central; ProQuest Dissertations & Theses Global. ( 1526443320). Retrieved from https://www.proquest.com/dissertations-theses/digital-media-engagement-moral-ethical-thinking/docview/1526443320/se-2?accountid=38867
  • Davis, K., & James, C. (2013). Tweens’ conceptions of privacy online: Implications for educators. Learning, Media and Technology, 38(1), 4–25. https://doi.org/10.1080/17439884.2012.658404
  • * Davis, S. J., Scott, J. A., Wohlwend, K. E., & Pennington, C. M. (2021a). Bringing joy to school: Engaging K–16 learners through maker literacies and playshops. Teachers College Record, 123(3), 1–23. https://doi.org/10.1177/016146812112300309
  • Denyer, D., & Tranfield, D. (2009). Producing a systematic review. In D. A. Buchanan & A. Bryman (Eds.), The sage handbook of organizational research methods (pp. 671–689). Sage.
  • Downs, S. H., & Black, N. (1998). The feasibility of creating a checklist for the assessment of the methodological quality of both randomised and non-randomised studies of health care interventions. Journal of Epidemiology & Community Health, 52(6), 377–384. http://dx.doi.org/10.1136/jech.52.6.377
  • * Escoda, A. P. (2013c). Media literacy in primary school: New challenges in the digital age. Teoría De La Educación: Educación y Cultura En La Sociedad De La Información, 15(1), 43–69.
  • * Gozansky, Y. (2021c). How a hands-on workshop offered by communication undergraduates in Israel enhanced fifth graders’ news literacy skills. The Journal of Media Literacy Education, 13(1), 131–137. https://doi.org/10.23860/JMLE-2021-13-1-11
  • * Gyabak, K., & Godina, H. (2011). Digital storytelling in Bhutan: A qualitative examination of new media tools used to bridge the digital divide in a rural community school. Computers & Education, 57(4), 2236–2243. https://doi.org/10.1016/j.compedu.2011.06.009
  • Hobbs, R. (2011). What a difference ten years can make: Research possibilities for the future of media literacy education. The Journal of Media Literacy Education, 3(1), 29–31. https://doi.org/10.23860/jmle-3-1-11
  • * Hobbs, R., Cabral, N., Ebrahimi, A., Yoon, J., & Al-Humaidan, R. (2010b, June). Combating Middle East stereotypes through media literacy education in elementary school. Presented at the annual meeting of the International Communication Association, Singapore.
  • * Husbye, N. E., Buchholz, B., Coggin, L. S., Powell, C. W., Wohlwend, K. E., & Fink, L. (2012a). Critical lessons and playful literacies: Digital media in PK-2 classrooms. Language Arts, 90(2), 82–92.
  • Israeli Ministry of Education. (n.d.). 21st Century Skills in Digital Media Tools [Hebrew]. https://edu.gov.il/minhalpedagogy/preschool/subject/science-and-technology/digital-media-lib/Pages/21st-century-skills.asp
  • Jacobson, S. K., Morales, N. A., Chen, B., Soodeen, R., Moulton, M. P., & Jain, E. (2019). Love or loss: Effective message framing to promote environmental conservation. Applied Environmental Education and Communication, 18(3), 252–265. https://doi.org/10.1080/1533015X.2018.1456380
  • Jeong, S. H., Cho, H., & Hwang, Y. (2012). Media literacy interventions: A meta-analytic review. Journal of Communication, 62(3), 454–472. https://doi.org/10.1111/j.1460-2466.2012.01643.x
  • Johnson, B. T., Scott-Sheldon, L. A. J., & Carey, M. P. (2010). Meta-synthesis of health behavior change meta-analyses. American Journal of Public Health, 100(11), 2193–2198. https://doi.org/10.2105/AJPH.2008.155200
  • Kurz, M., Rosendahl, J., Rodeck, J., Muehleck, J., & Berger, U. (2022). School-based interventions improve body image and media literacy in youth: A systematic review and meta-analysis. Journal of Prevention, 43(1), 5–23. https://doi.org/10.1007/s10935-021-00660-1
  • Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gotzsche, P. C., Ioannidis, J. P. A., Clarke, M., Devereaux, P. J., Kleijnen, J., & Moher, D. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: Explanation and elaboration. Journal of Clinical Epidemiology, 62(10), e1–e34. https://doi.org/10.1016/j.jclinepi.2009.06.006
  • Livingstone, S., Mascheroni, G., & Stoilova, M. (2021). The outcomes of gaining digital skills for young people’s lives and wellbeing: A systematic evidence review. New Media & Society, 25(5), 1–27. https://doi.org/10.1177/14614448211043189
  • Livingstone, S., Stoilova, M., & Nandagiri, R. (2019). Children’s data and privacy online: Growing up in a digital age: An evidence review. London: London School of Economics and Political Science.
  • * Maqsood, S., & Chiasson, S. (2021b). Design, development, and evaluation of a cybersecurity, privacy, and digital literacy game for tweens. ACM Transactions on Privacy and Security, 24(4), 1–37. https://doi.org/10.1145/3469821
  • Martin, A., & Grudziecki, J. (2006). DigEuLit: Concepts and tools for digital literacy development. Innovation in Teaching and Learning in Information and Computer Sciences, 5(4), 249–267. https://doi.org/10.11120/ital.2006.05040249
  • * Melander Bowden, H., & Aarsand, P. (2020). Designing and assessing digital games in a classroom: An emerging culture of critique. Learning, Media and Technology, 45(4), 376–394. https://doi.org/10.1080/17439884.2020.1727500
  • Navarra, G. A., Thomas, E., Scardina, A., Izadi, M., Zangla, D., De Dominicis, S., & Bellafiore, M. (2021). Effective strategies for promoting physical activity through the use of digital media among school-age children: A systematic review. Sustainability, 13(20), 11270. https://doi.org/10.3390/su132011270
  • Ofcom. (2022). Children and parents: Media use and attitudes report. https://www.ofcom.org.uk/data/assets/pdf_file/0024/234609/childrens-media-use-and-attitudes-report-2022.pdf
  • * Pinkleton, B. E., Austin, E. W., Chen, Y. C. Y., & Cohen, M. (2012c). The role of media literacy in shaping adolescents’ understanding of and responses to sexual portrayals in mass media. Journal of Health Communication, 17(4), 460–476. https://doi.org/10.1080/10810730.2011.635770
  • * Pinkleton, B. E., Austin, E. W., Chen, Y. C. Y., & Cohen, M. (2013b). Assessing effects of a media literacy-based intervention on US adolescents’ responses to and interpretations of sexual media messages. Journal of Children and Media, 7(4), 463–479. https://doi.org/10.1080/17482798.2013.781512
  • Potter, W. J. (2013). Review of literature on media literacy. Sociology Compass, 7(6), 417–435. https://doi.org/10.1111/soc4.12041
  • Potter, W. J., & Thai, C. L. (2019). Reviewing media literacy intervention studies for validity. Review of Communication Research, 7, 38–66. https://doi.org/10.12840/ISSN.2255-4165.018
  • Rideout, V., Peebles, A., Mann, S., & Robb, M. B. (2022). Common sense census: Media use by tweens and teens, 2021. Common Sense Media. https://www.commonsensemedia.org/sites/default/files/research/report/8-18-census-integrated-report-final-web_0.pdf
  • * Schmier, S. (2014c). Popular culture in a digital media studies classroom. Literacy, 48(1), 39–46. https://doi.org/10.1111/lit.12025
  • See, B. H., Gorard, S., El-Soufi, N., Lu, B., Siddiqui, N., & Dong, L. (2020). A systematic review of the impact of technology-mediated parental engagement on student outcomes. Educational Research & Evaluation, 26(3–4), 150–181. https://doi.org/10.1080/13803611.2021.1924791
  • Shea, B. J., Reeves, B. C., Wells, G., Thuku, M., Hamel, C., Moran, J., Moher, D., Tugwell, P., Welch, V., Kristjansson, E., & Henry, D. A. (2017). AMSTAR 2: A critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ, 358, j4008. https://doi.org/10.1136/bmj.j4008
  • * Solomon, M. J. (2010a). The need for (digital) story: First graders using digital tools to tell stories. Available from ProQuest Central; ProQuest Dissertations & Theses Global. ( 748254538). Retrieved from https://www.proquest.com/dissertations-theses/need-digital-story-first-graders-using-tools-tell/docview/748254538/se-2?accountid=38867
  • * Sousa, C., Cardoso, D., Costa, C., & Tyner, K. (2018). Making games, making literacy: A case-study in formal educational contexts. Academic Conferences International Limited. Retrieved from https://www.proquest.com/conference-papers-proceedings/making-games-literacy-case-study-formal/docview/2131785394/se-2?accountid=38867
  • Tandon, A., Dhir, A., & Mäntymäki, M. (2021). Jealousy due to social media? A systematic literature review and framework of social media-induced jealousy. Internet Research, 31(5), 1541–1582. https://doi.org/10.1108/INTR-02-2020-0103
  • Teng, L., Zhao, G., Wu, Y., Fu, H., & Wang, J. (2019). Positive versus negative messaging in discouraging drunken driving: Matching behavior consequences with target groups. Journal of Advertising Research, 59(2), 185–195. https://doi.org/10.2501/JAR-2018-029
  • Turin, O., & Friesem, Y. (2020). Is that media literacy?: Israeli and US media scholars’ perceptions of the field. The Journal of Media Literacy Education, 12(1), 132–144. https://doi.org/10.23860/JMLE-2020-12-1-10
  • Vahedi, Z., Sibalis, A., & Sutherland, J. E. (2018). Are media literacy interventions effective at changing attitudes and intentions towards risky health behaviors in adolescents? A meta-analytic review. Journal of Adolescence, 67(1), 140–152. https://doi.org/10.1016/j.adolescence.2018.06.007
  • Vannucci, A., Simpson, E. G., Gagnon, S., & Ohannessian, C. M. (2020). Social media use and risky behaviors in adolescents: A meta-analysis. Journal of Adolescence, 79(1), 258–274. https://doi.org/10.1016/j.adolescence.2020.01.014
  • * Walther, B., Hanewinkel, R., & Morgenstern, M. (2014b). Effects of a brief school-based media literacy intervention on digital media use in adolescents: Cluster randomized controlled trial. Cyberpsychology, Behavior and Social Networking, 17(9), 616–623. https://doi.org/10.1089/cyber.2014.0173
  • Watson, R., & Vaughn, L. M. (2006). Limiting the effects of the media on body image: Does the length of a media literacy intervention make a difference? Eating Disorders, 14(5), 385–400. https://doi.org/10.1080/10640260600952530
  • Xie, X., Gai, X., & Zhou, Y. (2019). A meta-analysis of media literacy interventions for deviant behaviors. Computers & Education, 139, 146–156. https://doi.org/10.1016/j.compedu.2019.05.008

Appendix A – Systematic Review Databases

Appendix B – Systematic Review Coding Protocol

Appendix C – Rationale for the Assessment of the Certainty of Evidence in Articles

Several guidelines have been suggested to evaluate the quality of the articles included in systematic reviews (e.g., the GRADE guidelines, Balshem et al., 2011; Downs and Black’s checklist, 1998). These checklists focus on the weight of evidence presented in articles in order to ascertain the risk of bias and the level of certainty that can be placed in their conclusions. These guidelines and checklists tend to be heavily skewed toward quantitative research – that is, many of their criteria are appropriate only for quantitative investigations (e.g., assessing statistical analyses and quantitative data reporting). As such, they did not seem to adequately fit the current review, which also includes many qualitative investigations, as is characteristic of the digital media literacy intervention field. Thus, this review assessed the certainty of evidence based on an overall assessment of each article following the criteria suggested in existing checklists but modified to also fit qualitative studies. A similar approach of review-specific global assessments was utilized in the Weight of Evidence framework presented by Livingstone et al. (2021) in their systematic review of research on adolescents’ digital skills.