Special Topic Section Introduction to Meta-Analyses and Systematic Reviews

Meta-Analyses and Systematic Reviews Advancing the Practice of School Psychology: The Imperative of Bringing Science to Practice

Abstract

Meta-analyses and systematic reviews synthesize the results of multiple studies, thus offering unique contributions to informing practice and advancing science in the field of school psychology. High-quality meta-analyses and systematic reviews include a review and summary of the extant studies and highlight clear directions for future research. Because individual scholars necessarily publish studies with specific samples, meta-analyses and systematic reviews allow for considering the collective evidence across samples and populations. The contents herein highlight (a) the importance of meta-analyses and systematic reviews, (b) elements of rigorous, high-quality meta-analyses and systematic reviews, and (c) contemporary meta-analyses and systematic reviews featured in School Psychology Review. These important meta-analyses and systematic reviews will benefit school psychology practitioners, scholars, and graduate students.

Impact Statement

This article highlights the contributions of meta-analyses and systematic reviews in bringing science to practice. School psychologists and other education professionals benefit from the synthesis of the extant literature providing knowledge and insights that inform best practices in supporting the healthy development and well-being of children and youth.

School psychology requires mastery of multiple content areas (National Association of School Psychologists [NASP], 2020); thus, staying informed across topic areas can be challenging even for dedicated professionals. Practitioners, graduate educators, and scholars seeking to stay abreast of contemporary research developments benefit from systematic reviews and meta-analyses (SRMAs) that synthesize findings across numerous studies to provide an organized and intelligible analysis of the state of evidence on a topic. The evidence synthesis movement (Cochrane, 2020; Evidence for Policy and Practice Information [EPPI], 2019; Royal Society, 2018) has built a scientific approach to compiling and disentangling the findings of multiple reports across various scientific journals and repositories, allowing researchers to identify and close knowledge gaps. These efforts distill multifaceted and, at times, seemingly contradictory evidence to isolate the components that vary in effectiveness depending on contextual factors. The result of this work allows us to clarify the empirical evidence that informs our work with students within the complex ecology of schools, thus further strengthening and advancing science-to-practice contributions.

This special topic section catalogs critical findings of meta-analyses and systematic reviews that advance science and inform practice. The articles herein address multiple domains of practice (NASP, 2020), including screening procedures, prevention programs, intervention components, and supervision practices, to provide knowledge and inform future practice and research. This introduction highlights the development of SRMA techniques and their promise to advance and inform the science and practice of school psychology. It also provides a brief overview, highlighting the contributions of each article included in this issue.

THE DEVELOPMENT OF SRMA TECHNIQUES

Pearson’s (1904) report on enteric fever inoculation statistics is the first recognized use of meta-analysis in clinical trials. Much has changed since that 1904 report; with methodological refinements and statistical advances, the approach is now widely employed across academic fields. As meta-analytic methods continued to develop, advances in search methodology distinguished the approach from other quantitative analytical techniques. Systematic reviews formally emerged in the late 1970s and 1980s and were later institutionalized with the founding of the Cochrane Collaboration in 1993 (EPPI, 2019). Collectively, these evidence synthesis methodologies allow researchers to organize and synthesize vast amounts of information into meaningful and cogent summaries of key findings.

Early meta-analysis techniques were substantially restricted by the types of data that could be analyzed. In contrast, contemporary meta-analytic techniques synthesize data using varied approaches. Similarly, network and subgroup meta-analytic techniques allow researchers to examine multiple effects across varied interventions or sub-populations. Additionally, developments in funnel plots and the associated statistical tests have advanced efforts to address publication biases (Stanley & Doucouliagos, 2014; van Aert et al., 2019). Collectively, these techniques offer scholars powerful tools to answer questions across the array of domains within school psychology without sacrificing the contextual nuances that make research findings clinically meaningful.

The methodology of systematic reviews is an approach to evidence synthesis that emerged in response to the growing need for replicable and reproducible methods (Bollen et al., 2015; Chalmers et al., 2002). Recent advancements include the development of guidelines for conducting reviews (e.g., Aromataris & Munn, 2020; Polanin et al., 2019; Tawfik et al., 2019) and reporting the findings (e.g., American Psychological Association, 2008; Page et al., 2020, 2021; Siddaway et al., 2019), as well as frameworks that define study parameters and identify gaps. For example, the Population, Intervention, Comparison, Outcome, and Setting framework (PICOS; Robinson et al., 2013) aids researchers in formulating research questions and facilitates literature review, and it is easily modified to account for other variables, such as time or the distinct demands of qualitative reviews (Methley et al., 2014). As another example, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA; Page et al., 2020, 2021) is a 27-item checklist intended to improve the transparency and reporting of systematic reviews; it includes an equity extension (PRISMA-E 2012; Welch et al., 2015) that helps reviewers identify, extract, and synthesize evidence on equity. These advancements provide guidelines and frameworks for researchers to improve the evidence synthesis paradigm.

Scoping reviews are another component of the evidence synthesis movement and represent a less well-known methodology used to structure the research evidence. Scoping and systematic reviews emphasize rigorous, transparent, and reproducible methodology to create a clear picture of the scientific literature on a given topic. Although the structured process across the review types is similar, their purpose differs. Scoping reviews aim to provide an overview or map of the concepts and evidence in the literature and are often used to identify the volume of the available studies. In contrast, systematic reviews provide answers to a single or precise question, such as the feasibility or effectiveness of a specific practice or treatment (Armstrong et al., 2011). Thus, systematic reviews, scoping reviews, and meta-analyses, while related, offer unique approaches with distinctive methodological advantages to the scientific community. The relationship between these methodologies is illustrated in Figure 1.

Figure 1. Breadth and Depth of Evidence Synthesis Methods

Increasingly, many fields of science are moving toward the Open Science Framework (OSF; Center for Open Science, 2022). The OSF promotes open and transparent research processes, including the development of research ideas, study design, storing and analyzing data, and writing and publishing reports or papers. OSF’s processes are a sign of the ever-refining efforts to improve the scientific process and are a welcomed development in our field. The evidence synthesis procedures and paradigm complement the OSF movement, giving consumers confidence that the findings are representative, comprehensive, and rigorous.

Looking forward, software programs to facilitate screening and data extraction are the next evolution within SRMA techniques. Multiple programs now use machine learning to screen large databases (Khalil et al., 2022), and additional developments in artificial intelligence to screen and catalog these databases are currently emerging. These programs promise to advance the evidence synthesis movement by further reducing the time required to identify studies and conduct reviews. While these developments offer researchers advantages in efficiency, they simultaneously create new challenges and considerations, including additional training to learn how to operate these programs. Researchers must also identify decision-making guidelines so that gains in efficiency do not sacrifice transparency.

EVIDENCE SYNTHESIS IN SCHOOL PSYCHOLOGY REVIEW

Supporting practice, science, and policy decisions with scientifically informed evidence is a fundamental objective of School Psychology Review, and the evidence synthesis movement aligns with this purpose. One of the first articles within the evidence synthesis movement published in School Psychology Review was Kavale’s (1988) article exploring the use of meta-analysis to answer questions about the amenable variables affecting learning in schools. Notably, while Kavale’s article touted the benefits of the methodology for the field, the article did not include a meta-analysis. It would be four more years until the first meta-analysis was published in School Psychology Review, with that distinction going to Swanson and Malone’s (1992) article examining the relationship between social skills and learning disabilities. Among notable early contributions, Jimerson (2001) published the journal’s most cited meta-analysis to date, examining grade retention research, highlighting its deleterious impacts, and emphasizing the implications for school psychologists in advocating for evidence-based interventions to support the social, emotional, and academic development of children. Polanin et al. (2012) conducted another of the most cited meta-analyses published in School Psychology Review, examining the importance of incorporating bystander intervention components to supplement bullying prevention programs. Polanin and colleagues’ findings remain relevant and are extended in the current issue with Torgal and colleagues’ (2023) examination of school-based cyberbullying intervention programs and their effectiveness in promoting active cyber-bystander behavior in K–12 settings.

The more recent methodological development of systematic reviews is also reflected in School Psychology Review publications. For example, Eklund et al. (2018) published a systematic review of state-level social–emotional learning (SEL) standards, examining those standards across age, grade, and content areas and discussing the development of age- and grade-appropriate SEL standards. Given the increasing attention to supporting social–emotional development at school, it is notable that this article is the most cited systematic review published by the journal in recent years. In the past five years, eight additional systematic reviews have been published in School Psychology Review (see Table 1), five of which are included in the current issue. One of these articles, Brann et al. (2022), provided a systematic review of the usability of social, emotional, and behavioral (SEB) assessments and identified the scarcity and limitations of usability research in SEB assessment domains.

Table 1. Meta-Analyses and Systematic Reviews Published in School Psychology Review (in the past 5 Years)

Two of these articles, Alexander and Reynolds (2020) and Schnorrbusch et al. (2020), used meta-analyses to identify potential contributors to the diagnosis or misdiagnosis of disabilities. Specifically, Alexander and Reynolds (2020) conducted a meta-analysis to examine the population correlation between intelligence and adaptive behavior as vital components in diagnosing intellectual disability. Their findings highlighted how the identification of intellectual disability is influenced by the correlation between intelligence and adaptive behavior; these findings inform practice and policy relevant to its eligibility criteria and diagnosis. Furthermore, Schnorrbusch et al. (2020) explored the impacts of the relative age effect on attention deficit hyperactivity disorder (ADHD) in children, indicating that a child’s relative age within the class has implications for diagnosis and treatment practices for children with ADHD in schools.

In two additional articles, Eklund et al. (2022) and Van Camp et al. (2020) conducted meta-analyses examining the effectiveness of practices and interventions. For example, Eklund et al. (2022) examined the effects of three types of interventions (i.e., family–school partnerships, behavior interventions, and academic interventions) on student attendance in pre-K–12 public schools. All three categories of interventions resulted in small effects, indicating that the most frequently implemented practices lead to minimal positive outcomes. These findings suggest further research is needed to identify and evaluate the effectiveness of new intervention approaches to improve student attendance. Similarly, Van Camp et al. (2020) examined the effectiveness of opportunities to respond (OTRs) in intensifying academic and behavioral intervention outcomes, which extended the literature by identifying specific interventions that can benefit from increased OTRs.

Finally, King et al. (2022) and Zakszeski and Rutherford (2021) provided systematic reviews of research on relatively recent practices and programs. King et al. (2022) reviewed the growing literature on school teleconsultation and provided a summary of key factors contributing to the successful delivery of teleconsultation. This review was timely and significant given the current era of teleservices since the COVID-19 pandemic. Zakszeski and Rutherford’s (2021) systematic review indicated shortcomings of the existing research on restorative justice approaches in schools, such as the lack of clear definitions, strategic guidelines and support for implementation, and evaluation plans. These results indicated that schools’ adoption of restorative justice approaches had outpaced the empirical evidence, likely contributing to a sustained “practice-to-research” gap.

Amidst the increasing demands for social, emotional, behavioral, and academic support, coupled with the shortages of school psychologists in many schools across the country, advancements in SRMAs have arrived at a critical juncture for school psychology. We need clear and concise syntheses and summaries of how to provide effective school-based services efficiently. SRMA techniques contribute by facilitating researchers’ efforts to amass information, summarize the state of efficacious practices for school psychology practitioners, and identify gaps in knowledge that warrant further focus from scholars.

OVERVIEW OF THESE CONTEMPORARY ARTICLES

The articles included in this special topic section feature the most recent evidence synthesis work to inform practice and future research (see Figure 2). First, Torgal et al. (2023) expand on the earlier bystander findings reported by Polanin et al. (2012) by examining school-based prevention programs targeting cyberbullying bystander behavior in online contexts. Their work systematically compiled evidence from 9 studies comprising 35 effect sizes. From this study, we gain a clearer understanding of the program characteristics associated with improved efficacy. These findings help inform programs aiming to improve interventions for online bystander behaviors.

Figure 2. Contributions of Articles to the Systematic Review and Meta-Analyses Literature

Continuing with a focus on studies addressing bullying, ten Bokkel et al. (2023) synthesized the evidence on teacher–student relationship quality and its effect on bullying perpetration and peer victimization. In their meta-analysis, ten Bokkel and colleagues examined 65 reports encompassing 185,881 students from preschool to high school. The authors conducted separate multilevel analyses for studies that addressed bullying perpetration and those that addressed peer victimization, and they incorporated moderating variables such as ethnicity, informant, and time between measures. This analysis synthesized years of inconclusive results, revealing a negative relationship between high-quality teacher–student relationships and bullying, thus further highlighting the importance of establishing high-quality teacher–student relationships as a protective factor against bullying behaviors and victimization. The protective role of high-quality teacher–student relationships is even more pronounced when the reporting student is an ethnic minority. The authors also point to the stability of these findings across time, which suggests that, once formed, the relationship dynamics remain constant across the school year.

Moving to mental health screening practices, Villarreal et al. (2023) examined 38 studies investigating gated screening practices. In their analysis, the authors found that participation rates were affected by consent practices (i.e., active or passive) and the screening stage (i.e., initial or secondary). Screening practices that required active consent were associated with participation rates of approximately 58%, whereas passive consent participation rates were substantially higher at approximately 96%. While participation rates were higher in the secondary screening phase, over a quarter of students did not participate in the initial screening, and nearly one fifth of those eligible did not participate in secondary screening. The results of Villarreal and colleagues’ synthesis shed light on some sources of low participation rates.

Marinucci et al. (2023) bring us the first scoping review published in School Psychology Review. Their report synthesizes school-based mental health literacy (MHL) interventions. MHL looks beyond the standard treatment of mental illness and focuses on interventions that sustain good mental health in the absence of illness. Marinucci and colleagues synthesized 27 studies investigating MHL interventions to identify the major components and provide quality ratings of the existing literature. The analysis revealed that seven major components had been explored in school-based MHL interventions, ranging from recognizing symptoms of mental illness and implementing appropriate coping skills to program delivery methods. The authors identified considerable variability in the quality ratings of the studies included in their synthesis. Looking forward, the authors suggest five indicators for researchers to address in future studies.

Cleary et al. (2023) provided a systematic review of the literature on self-regulated learning (SRL) microanalysis between 1997 and 2020. The final set included 42 peer-reviewed articles and dissertations that represented the use and application of SRL microanalysis across diverse phases (i.e., forethought, performance, and self-reflection), domains (e.g., academics, music, athletics), activities (e.g., problem-solving, reasoning), and populations (e.g., from elementary to graduate school). This systematic review also identified recent research trends in using SRL microanalysis as a diagnostic or instructional tool to enhance intervention and instructional effects rather than testing its effectiveness as an intervention itself. These findings support the flexibility and adaptability of SRL microanalysis, suggesting that its use is not limited to a certain domain, task, or grade. Instead, this assessment approach can be applied to diverse areas that require students’ regulatory skills.

Another study by Alperin et al. (2023) was the first to systematically review the literature on the effects of behavioral interventions on implementers and student outcomes. This study included 51 studies published between 2000 and 2020, including 6,498 middle school students with disruptive behaviors and 264 implementers. Results emphasize four core dimensions of the identified studies, including sample characteristics, intervention components, research methodology, and reported outcomes, revealing the gaps within the current literature. Furthermore, results revealed small to large positive effects on teacher practices and skills, such as behavior management techniques and instructional support skills, as well as student behavioral outcomes, such as disruptive behavior, and more specifically, on-task behavior, academic performance, and self-regulation. Finally, this study highlighted the strengths and weaknesses of the existing literature for behavior interventions implemented with middle school students who display disruptive behaviors, offering avenues for future research with the population.

Despite the increase in the number of anti-Islamic hate crimes and discriminatory acts across the globe, there is a limited understanding of the scope of religious discrimination toward Muslim youths and its associated outcomes. Thus, Abu Khalaf et al. (2023) conducted a mixed methods systematic review of 44 qualitative and quantitative studies on the impacts of Islamophobia on Muslim youths in the United States and internationally to assess gaps and trends in their experiences of discrimination. Findings suggested different discrimination experiences (e.g., bullying, harassment, and exclusion) and what factors contributed to students experiencing higher rates of discrimination, such as cultural and religious engagement, appearance, ethnicity, and immigration status. Furthermore, the review provided a summary of students’ responses to discrimination (e.g., ignoring, in-group socialization, acculturation), the roles of schools in allowing and perpetuating religious discrimination, associated student outcomes, and protective factors for handling discrimination. Based on its findings, this systematic review made recommendations for developing culturally responsive and equitable environments and policies for all youths.

Given the long history of workforce shortages in school psychology, the field warrants a thorough review of mentoring research to support evidence-based practices and close the research–practice gap. Grapin et al. (2023) provide a systematic review of mentoring research in school psychology, including 16 empirical studies published between 1988 and 2020. The review first described the methodological characteristics of the included studies; all were cross-sectional, limiting our understanding of mentoring processes and long-term impacts. Additionally, a qualitative thematic synthesis reported three key findings: (1) the context of mentoring, (2) race, ethnicity, sex, and gender in mentoring, and (3) the functions of mentoring. The summary of these key themes highlighted potential benefits, including access to role models, career-related guidance, and psychosocial well-being, as well as barriers to mentoring for individuals with racial, ethnic, sex, and gender-minoritized identities. Furthermore, the review pointed out that most studies have yet to address the intersectionality of identities and their impacts on mentorship, warranting future research using a more comprehensive and empirically robust approach to improving mentoring practices in school psychology.

DISCUSSION AND FUTURE DIRECTIONS

The articles herein synthesize the latest empirical developments across multiple NASP practice domains. SRMAs are important in further developing rigorous methods that are transparent, replicable, and improve the scientific basis of our collective knowledge. Professionals seeking to review the latest scientific developments efficiently can have confidence in high-quality SRMAs to aggregate and summarize the latest empirical developments. The work in this issue clarifies pressing concerns and implications for practice within the identified topics, while also illuminating new paths for discovery.

SRMA methodology holds a promising future and will persist in informing the science and practice of school psychology. Our efforts to advocate and implement effective prevention and intervention services are the foundation on which the field of school psychology is built. As the field matures and school psychologists investigate new areas of practice, SRMA methodology will continue to advance and be used to cultivate the strengths of our work while also providing directions regarding less efficacious practices.

DISCLOSURE

The authors have no conflicts of interest to report.

Additional information

Notes on contributors

Justin P. Allen

Justin P. Allen is an Assistant Professor at Sam Houston State University, where he is the Associate Director of the School Psychology Program. Dr. Allen’s research examines assessment and intervention practices within the context of manifestation determination reviews; he also maintains an interest in using systematic reviews to advance empirically informed practices. Currently, he serves as an Editorial Fellow with School Psychology Review.

Eui Kyung Kim

Eui Kyung Kim, PhD, is an Assistant Professor in the School of Education at the University of California, Riverside, and a Nationally Certified School Psychologist. Dr. Kim’s research focuses on understanding the pathways to risk and resilience among underrepresented populations in K–12 schools and multiculturally responsive graduate recruitment, retention, and training. She currently serves as an Editorial Fellow with School Psychology Review.

Shane R. Jimerson

Shane R. Jimerson, PhD, is a Professor at the University of California, Santa Barbara, and a Nationally Certified School Psychologist. His scholarship focuses on understanding and supporting the social, emotional, behavioral, academic, and mental health development of youth, as well as understanding and advancing the field of school psychology internationally.

REFERENCES

  • Abu Khalaf, N., Woolweaver, A. B., Marmolejos, R. R., Little, G. A., Burnett, K., & Espelage, D. L. (2023). The impact of Islamophobia on Muslim students: A systematic review of the literature. School Psychology Review, 52(2), 206–223. https://doi.org/10.1080/2372966X.2022.2075710
  • Alexander, R. M., & Reynolds, M. R. (2020). Intelligence and adaptive behavior: A meta-analysis. School Psychology Review, 49(2), 85–110. https://doi.org/10.1080/2372966X.2020.1717374
  • Alperin, A., Reddy, L. A., Glover, T. A., Bronstein, B., Wiggs, N. B., & Dudek, C. M. (2023). School-based interventions for middle school students with disruptive behaviors: A systematic review of components and methodology. School Psychology Review, 52(2), 180–205. https://doi.org/10.1080/2372966X.2021.1883996
  • APA Publications and Communications Board Working Group on Journal Article Reporting Standards. (2008). Reporting standards for research in psychology: Why do we need them? What might they be? American Psychologist, 63(9), 839–851. https://www.apa.org/pubs/authors/jars.pdf
  • Armstrong, R., Hall, B. J., Doyle, J., & Waters, E. (2011). Cochrane update. ‘Scoping the scope’ of a Cochrane review. Journal of Public Health, 33(1), 147–150. https://doi.org/10.1093/pubmed/fdr015
  • Aromataris, E., & Munn, Z. (2020). JBI manual for evidence synthesis. JBI. https://doi.org/10.46658/JBIMES-20-02
  • Bollen, K., Cacioppo, J. T., Kaplan, R. M., Krosnick, J. A., & Olds, J. L. (2015). Social, Behavioral, and Economic sciences perspectives on robust and reliable science. Report of the subcommittee on replicability in Science Advisory Committee to the National Science Foundation for Social, Behavioral, and Economic Sciences. National Science Foundation. https://www.nsf.gov/sbe/AC_Materials/SBE_Robust_and_Reliable_Research_Report.pdf
  • Brann, K. L., Daniels, B., Chafouleas, S. M., & DiOrio, C. A. (2022). Usability of social, emotional, and behavioral assessments in schools: A systematic review from 2009 to 2019. School Psychology Review, 51(1), 6–24. https://doi.org/10.1080/2372966X.2020.1836518
  • Center for Open Science. (2022). Open Science Framework (OSF). https://www.cos.io/
  • Chalmers, I., Hedges, L. V., & Cooper, H. (2002). A brief history of research synthesis. Evaluation & the Health Professions, 25(1), 12–37. https://doi.org/10.1177/0163278702025001003
  • Cleary, T. J., Slemp, J., Reddy, L. A., Alperin, A., Lui, A., Austin, A., & Cedar, T. (2023). Characteristics and uses of SRL microanalysis across diverse contexts, tasks, and populations: A systematic review. School Psychology Review, 52(2), 159–179. https://doi.org/10.1080/2372966X.2020.1862627
  • Cochrane. (2020, January 3). Evidence synthesis - What is it and why do we need it? Retrieved January 22, 2023, from https://www.cochrane.org/news/evidence-synthesis-what-it-and-why-do-we-need-it
  • Eklund, K., Burns, M. K., Oyen, K., DeMarchena, S., & McCollom, E. M. (2022). Addressing chronic absenteeism in schools: A meta-analysis of evidence-based interventions. School Psychology Review, 51(1), 95–111. https://doi.org/10.1080/2372966X.2020.1789436
  • Eklund, K., Kilpatrick, K. D., Kilgus, S. P., & Haider, A. (2018). A systematic review of state-level social–emotional learning standards: Implications for practice and research. School Psychology Review, 47(3), 316–326. https://doi.org/10.17105/SPR-2017.0116.V47-3
  • Evidence for Policy and Practice Information (EPPI). (2019). History of systematic reviews. https://eppi.ioe.ac.uk/cms/Resources/EvidenceInformedPolicyandPractice/HistoryofSystematicReviews/tabid/68/Default.aspx
  • Grapin, S. L., Collier-Meek, M. A., January, S.-A. A., Yang, C., & Portillo, N. L. (2023). Reconceptualizing mentorship for the 21st century: A systematic mapping of research in school psychology. School Psychology Review, 52(2), 224–242. https://doi.org/10.1080/2372966X.2021.1910861
  • Jimerson, S. R. (2001). Meta-analysis of grade retention research: Implications for practice in the 21st century. School Psychology Review, 30(3), 420–437. https://doi.org/10.1080/02796015.2001.12086124
  • Kavale, K. A. (1988). Using meta-analysis to answer the question: What are the important, manipulable influences on school learning? School Psychology Review, 17(4), 644–650. https://doi.org/10.1080/02796015.1988.12085382
  • Khalil, H., Ameen, D., & Zarnegar, A. (2022). Tools to support the automation of systematic reviews: A scoping review. Journal of Clinical Epidemiology, 144, 22–42. https://doi.org/10.1016/j.jclinepi.2021.12.005
  • King, H. C., Bloomfield, B. S., Wu, S., & Fischer, A. J. (2022). A systematic review of school teleconsultation: Implications for research and practice. School Psychology Review, 51(2), 237–256. https://doi.org/10.1080/2372966X.2021.1894478
  • Marinucci, A., Grové, C., & Allen, K.-A. (2023). A scoping review and analysis of mental health literacy interventions for children and youth. School Psychology Review, 52(2), 144–158. https://doi.org/10.1080/2372966X.2021.2018918
  • Methley, A. M., Campbell, S., Chew-Graham, C., McNally, R., & Cheraghi-Sohi, S. (2014). PICO, PICOS and SPIDER: A comparison study of specificity and sensitivity in three search tools for qualitative systematic reviews. BMC Health Services Research, 14(1), 579. https://doi.org/10.1186/s12913-014-0579-0
  • National Association of School Psychologists. (2020). Model for comprehensive and integrated school psychological services: The NASP practice model. National Association of School Psychologists. https://www.nasponline.org/standards-and-certification/nasp-practice-model
  • Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., … Moher, D. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Systematic Reviews, 10(1), 89. https://doi.org/10.1186/s13643-021-01626-4
  • Page, M. J., Moher, D., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., … McKenzie, J. E. (2020). PRISMA 2020 explanation and elaboration: Updated guidance and exemplars for reporting systematic reviews. British Medical Journal, 372, n160. https://doi.org/10.1136/bmj.n160
  • Pearson, K. (1904). Report on certain enteric fever inoculation statistics. British Medical Journal, 2(2288), 1243–1246. https://doi.org/10.1136/bmj.2.2288.1243
  • Polanin, J. R., Espelage, D. L., & Pigott, T. D. (2012). A meta-analysis of school-based bullying prevention programs’ effects on bystander intervention behavior. School Psychology Review, 41(1), 47–65. https://doi.org/10.1080/02796015.2012.12087375
  • Polanin, J. R., Pigott, T. D., Espelage, D. L., & Grotpeter, J. K. (2019). Best practice guidelines for abstract screening large‐evidence systematic reviews and meta‐analyses. Research Synthesis Methods, 10(3), 330–342. https://doi.org/10.1002/jrsm.1354
  • Robinson, K. A., Akinyede, O., Dutta, T., Sawin, V. I., Li, T., Spencer, M. R., Turkelson, C. M., & Weston, C. (2013). Framework for determining research gaps during systematic review: Evaluation methods research report. Johns Hopkins University Evidence-based Practice Center: Agency for Healthcare Research and Quality. https://www.ncbi.nlm.nih.gov/books/NBK126708/
  • Royal Society. (2018, September 19). Evidence synthesis for policy. https://royalsociety.org/topics-policy/projects/evidence-synthesis/
  • Schnorrbusch, C., Fabiano, G. A., Aloe, A. M., & Toro Rodriguez, R. C. (2020). Attention deficit hyperactivity disorder and relative age: A meta-analysis. School Psychology Review, 49(1), 2–19. https://doi.org/10.1080/2372966X.2020.1717368
  • Siddaway, A. P., Wood, A. M., & Hedges, L. V. (2019). How to do a systematic review: A best practice guide for conducting and reporting narrative reviews, meta-analyses, and meta-syntheses. Annual Review of Psychology, 70, 747–770. https://doi.org/10.1146/annurev-psych-010418-102803
  • Stanley, T. D., & Doucouliagos, H. (2014). Meta-regression approximations to reduce publication selection bias. Research Synthesis Methods, 5(1), 60–78. https://doi.org/10.1002/jrsm.1095
  • Swanson, H. L., & Malone, S. (1992). Social skills and learning disabilities: A meta-analysis of the literature. School Psychology Review, 21(3), 427–443. https://doi.org/10.1080/02796015.1992.12085627
  • Tawfik, G. M., Dila, K. A. S., Mohamed, M. Y. F., Tam, D. N. H., Kien, N. D., Ahmed, A. M., & Huy, N. T. (2019). A step by step guide for conducting a systematic review and meta-analysis with simulation data. Tropical Medicine and Health, 47(1), 46. https://doi.org/10.1186/s41182-019-0165-6
  • ten Bokkel, I. M., Roorda, D. L., Maes, M., Verschueren, K., & Colpin, H. (2023). The role of affective teacher–student relationships in bullying and peer victimization: A multilevel meta-analysis. School Psychology Review, 52(2), 110–129. https://doi.org/10.1080/2372966X.2022.2029218
  • Torgal, C., Espelage, D. L., Polanin, J. R., Ingram, K. M., Robinson, L. E., El Sheikh, A. J., & Valido, A. (2023). A meta-analysis of school-based cyberbullying prevention programs’ impact on cyber-bystander behavior. School Psychology Review, 52(2), 95–109. https://doi.org/10.1080/2372966X.2021.1913037
  • van Aert, R. C. M., Wicherts, J. M., & van Assen, M. A. (2019). Publication bias examined in meta-analyses from psychology and medicine: A meta-meta-analysis. PLoS ONE, 14(4), 1–32. https://doi.org/10.1371/journal.pone.0215052
  • Van Camp, A. M., Wehby, J. H., Martin, B. L. N., Wright, J. R., & Sutherland, K. S. (2020). Increasing opportunities to respond to intensify academic and behavioral interventions: A meta-analysis. School Psychology Review, 49(1), 31–46. https://doi.org/10.1080/2372966X.2020.1717369
  • Villarreal, V., Castro-Villarreal, F., Peterson, L. S., Bear, M., Cortes, D. M., & Escobedo, T. (2023). Meta-analysis of proportions of students screened and identified in mental health multiple-gate screening research. School Psychology Review, 52(2), 130–143. https://doi.org/10.1080/2372966X.2022.2106155
  • Welch, V., Petticrew, M., Petkovic, J., Moher, D., Waters, E., White, H., & Tugwell, P. (2015). Extending the PRISMA statement to equity-focused systematic reviews (PRISMA-E 2012): Explanation and elaboration. International Journal for Equity in Health, 14, 92. https://doi.org/10.1186/s12939-015-0219-2
  • Zakszeski, B., & Rutherford, L. (2021). Mind the gap: A systematic review of research on restorative practices in schools. School Psychology Review, 50(2–3), 371–387. https://doi.org/10.1080/2372966X.2020.1852056
