
A systematic review of single-case experimental design meta-analyses: characteristics of study designs, data, and analyses


References

  • Alresheed, F., Hott, B. L., & Bano, C. (2013). Single subject research: A synthesis of analytic methods. Journal of Special Education Apprenticeship, 2(1), 1–18. https://scholarworks.lib.csusb.edu/josea/vol2/iss1/1
  • Alresheed, F., & Machalicek, W. (2022). A systematic comparison of non-overlap metrics and visual analysis in single-case experimental designs. Evidence-Based Communication Assessment and Intervention.
  • Alstot, A. E., Kang, M., & Alstot, C. D. (2013). Effects of interventions based in behavior analysis on motor skill acquisition: A meta-analysis. The Physical Educator, 70(2), 155–186.
  • Beretvas, S. N., & Chung, H. (2008a). A review of meta-analyses of single-subject experimental designs: Methodological issues and practice. Evidence-Based Communication Assessment and Intervention, 2(3), 129–141. https://doi.org/10.1080/17489530802446302
  • Beretvas, S. N., & Chung, H. (2008b). An evaluation of modified R²-change effect size indices for single-subject experimental designs. Evidence-Based Communication Assessment and Intervention, 2(3), 120–128. https://doi.org/10.1080/17489530802446328
  • Borckardt, J. J., & Nash, M. R. (2014). Simulation modelling analysis for small sets of single-subject data collected over time. Neuropsychological Rehabilitation, 24(3–4), 492–506. https://doi.org/10.1080/09602011.2014.895390
  • Bowman-Perrott, L., Davis, H. S., Vannest, K. J., Williams, L., Greenwood, C., & Parker, R. I. (2013). Academic benefits of peer tutoring: A meta-analytic review of single-case research. School Psychology Review, 42(1), 39–55.
  • Campbell, J. M. (2004). Statistical comparison of four effect sizes for single-subject designs. Behavior Modification, 28(2), 234–246. https://doi.org/10.1177/0145445503259264
  • Cheung, M.-W.-L. (2014). Modeling dependent effect sizes with three-level meta-analyses: A structural equation modeling approach. Psychological Methods, 19(2), 211–229. https://doi.org/10.1037/a0032968
  • Farmer, J. L., Owens, C. M., Ferron, J. M., & Allsopp, D. H. (2010). A methodological review of single-case meta-analyses [Paper presentation]. Annual Meeting of the American Educational Research Association, Denver, CO.
  • Fingerhut, J., Xu, X., & Moeyaert, M. (2021). Selecting the proper Tau-U measure for single-case experimental designs: Development and application of a decision flowchart. Evidence-Based Communication Assessment and Intervention, 15(3), 99–114. https://doi.org/10.1080/17489539.2021.1937851
  • Gierut, J. A., Morrisette, M. L., & Dickinson, S. L. (2015). Effect size for single-subject design in phonological treatment. Journal of Speech, Language, and Hearing Research, 58(5), 1464–1481. https://doi.org/10.1044/2015
  • Gingerich, W. J. (1984). Meta-analysis of applied time-series data. The Journal of Applied Behavioral Science, 20(1), 71–79. https://doi.org/10.1177/002188638402000113
  • Harrington, M. A. (2013). Comparing visual and statistical analysis in single-subject studies [Open access dissertation]. http://digitalcommons.uri.edu/oa_diss/5
  • Hedges, L. V., Tipton, E., & Johnson, M. C. (2010). Robust variance estimation in meta-regression with dependent effect size estimates. Research Synthesis Methods, 1(1), 39–65. https://doi.org/10.1002/jrsm.5
  • Heyvaert, M., Moeyaert, M., Verkempynck, P., Van den Noortgate, W., Vervloet, M., Ugille, M., & Onghena, P. (2017). Testing the intervention effect in single-case experiments: A Monte Carlo simulation study. Journal of Experimental Education, 85(2), 175–196. https://doi.org/10.1080/00220973.2015.1123667
  • Jamshidi, L., Heyvaert, M., Declercq, L., Fernández-Castilla, B., Ferron, J. M., Moeyaert, M., & Van den Noortgate, W. (2018). Methodological quality of meta-analyses of single-case experimental studies. Research in Developmental Disabilities, 79, 97–115. https://doi.org/10.1016/j.ridd.2017.12.016
  • Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2010). Single-case design technical documentation. https://doi.org/10.1037/e578392011-004
  • Kung, J., Chiappelli, F., Cajulis, O. O., Avezova, R., Kossan, G., Chew, L., & Maida, C. A. (2010). From systematic reviews to clinical recommendations for evidence-based health care: Validation of revised assessment of multiple systematic reviews (R-AMSTAR) for grading of clinical relevance. The Open Dentistry Journal, 4(1), 84–91. https://doi.org/10.2174/1874210601004020084
  • Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. Sage.
  • Little, S. G., Akin-Little, A., & O’Neill, K. (2015). Group contingency interventions with children—1980-2010: A meta-analysis. Behavior Modification, 39(2), 322–341. https://doi.org/10.1177/0145445514554393
  • Ma, H.-H. (2006). An alternative method for quantitative synthesis of single-subject researches: Percentage of data points exceeding the median. Behavior Modification, 30(5), 598–617. https://doi.org/10.1177/0145445504272974
  • Maggin, D. M., Swaminathan, H., Rogers, H. J., O’Keeffe, B. V., Sugai, G., & Horner, R. H. (2011). A generalized least squares regression approach for computing effect sizes in single-case research: Application examples. Journal of School Psychology, 49(3), 301–321. https://doi.org/10.1016/j.jsp.2011.03.004
  • Maggin, D. M., O’Keeffe, B. V., & Johnson, A. H. (2011). A quantitative synthesis of methodology in the meta-analysis of single-subject research for students with disabilities: 1985–2009. Exceptionality: A Special Education Journal, 19(2), 109–135. https://doi.org/10.1080/09362835.2011.565725
  • Maggin, D. M., Briesch, A. M., & Chafouleas, S. M. (2013). An application of the what works clearinghouse standards for evaluating single-subject research: Synthesis of the self-management literature base. Remedial and Special Education, 34(1), 44–58. https://doi.org/10.1177/0741932511435176
  • Manolov, R., & Solanas, A. (2008). Comparing N = 1 effect size indices in presence of autocorrelation. Behavior Modification, 32(6), 860–875. https://doi.org/10.1177/0145445508318866
  • Manolov, R., & Moeyaert, M. (2017). Recommendations for choosing single-case data analytical techniques. Behavior Therapy, 48(1), 97–114. https://doi.org/10.1016/j.beth.2016.04.008
  • Manolov, R., Moeyaert, M., & Fingerhut, J. E. (2022). A Priori justification for effect measures in single-case experimental designs. Perspectives on Behavior Science, 45(1), 153–186. https://doi.org/10.1007/s40614-021-00282-2
  • Marín-Martínez, F., & Sánchez-Meca, J. (1999). Averaging dependent effect sizes in meta-analysis: A cautionary note about procedures. The Spanish Journal of Psychology, 2(1), 32–38. https://doi.org/10.1017/S1138741600005436
  • Maughan, D. R., Christiansen, E., Jenson, W. R., Olympia, D., & Clark, E. (2005). Behavioral parent training as a treatment for externalizing behaviors and disruptive behavior disorders: A meta-analysis. School Psychology Review, 34(3), 267–286. https://doi.org/10.1080/02796015.2005.12086287
  • Miller, F. G. (2011). Do functional behavioral assessments improve intervention effectiveness for students with ADHD? A single-subject meta-analysis [PhD thesis]. The Pennsylvania State University. https://www.proquest.com/docview/906298983?pq-origsite=gscholar&fromopenview=true
  • Moeyaert, M., Ugille, M., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2014). The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-subject experimental design research. Behavior Modification, 38(5), 665–704. https://doi.org/10.1177/0145445514535243
  • Moeyaert, M., Ugille, M., Beretvas, S. N., Ferron, J. M., Bunuan, R., & Van den Noortgate, W. (2016). Methods for dealing with multiple outcomes in meta-analysis: A comparison between averaging effect sizes, robust variance estimation and multilevel meta-analysis. International Journal of Social Research Methodology, 20(6), 559–572. https://doi.org/10.1080/13645579.2016.1252189
  • Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & The PRISMA Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine, 6(7), 1–6. https://doi.org/10.1371/journal.pmed.1000097
  • Morgan, D. L., & Morgan, R. K. (2001). Single-participant research design: Bringing science to managed care. The American Psychologist, 56(2), 119–127. https://doi.org/10.1037//0003-066X.56.2.119
  • Onghena, P., & Edgington, E. S. (2005). Customization of pain treatments: Single-case design and analysis. The Clinical Journal of Pain, 21(1), 56–68; discussion 69–72. https://doi.org/10.1097/00002508-200501000-00007
  • Owens, C. M. (2011). Meta-analysis of single-case data: A Monte Carlo investigation of a three-level model [Graduate thesis]. http://scholarcommons.usf.edu/etd/3273
  • Parker, R. I., Hagan-Burke, S., & Vannest, K. J. (2007). Percentage of all non-overlapping data (PAND): An alternative to PND. The Journal of Special Education, 40(4), 194–204. https://doi.org/10.1177/00224669070400040101
  • Parker, R. I., Vannest, K. J., & Brown, L. (2009). The improvement rate difference for single-case research. Exceptional Children, 75(2), 135–150. https://doi.org/10.1177/001440290907500201
  • Parker, R. I., Vannest, K. J., Davis, J. L., & Sauber, S. B. (2011). Combining nonoverlap and trend for single-case research: Tau-U. Behavior Therapy, 42(2), 284–299. https://doi.org/10.1016/j.beth.2010.08.006
  • Parker, R. I., Vannest, K. J., & Davis, J. L. (2011). Effect size in single-case research: A review of nine nonoverlap techniques. Behavior Modification, 35(4), 303–322. https://doi.org/10.1177/0145445511399147
  • Petit-Bois, M., Baek, E., Van den Noortgate, W., Beretvas, S. N., & Ferron, J. M. (2016). The consequences of modeling autocorrelation when synthesizing single-case studies using a three-level model. Behavior Research Methods, 48(2), 803–812. https://doi.org/10.3758/s13428-015-0612-1
  • Rakap, S., Snyder, P., & Pasia, C. (2014). Comparison of nonoverlap methods for identifying treatment effect in single-subject experimental research. Behavioral Disorders, 39(3), 128–145. https://doi.org/10.1177/019874291303900303
  • Rosenthal, R. (1994). Parametric measures of effect size. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 231–244). Russell Sage.
  • Scammacca, N., Roberts, G., & Stuebing, K. K. (2014). Meta-analysis with complex research designs: Dealing with dependence from multiple measures and multiple group comparisons. Review of Educational Research, 84(3), 328–364. https://doi.org/10.3102/0034654313500826
  • Schlosser, R. W., Lee, D. L., & Wendt, O. (2008). Application of the percentage of non-overlapping data (PND) in systematic reviews and meta-analyses: A systematic review of reporting characteristics. Evidence-Based Communication Assessment and Intervention, 2(3), 163–187. https://doi.org/10.1080/17489530802505412
  • Scruggs, T. E., Mastropieri, M. A., & Casto, G. (1987). The quantitative synthesis of single-subject research: Methodology and validation. Remedial and Special Education, 8(2), 24–33. https://doi.org/10.1177/074193258700800206
  • Shadish, W. R., & Rindskopf, D. M. (2007). Methods for evidence-based practice: Quantitative synthesis of single-subject designs. New Directions for Evaluation, 2007(113), 95–109. https://doi.org/10.1002/ev.217
  • Shadish, W. R., Hedges, L. V., & Pustejovsky, J. E. (2014). Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: A primer and applications. Journal of School Psychology, 52(2), 123–147. https://doi.org/10.1016/j.jsp.2013.11.005
  • Shadish, W. R. (2014a). Analysis and meta-analysis of single-case designs: An introduction. Journal of School Psychology, 52(2), 109–122. https://doi.org/10.1016/j.jsp.2013.11.009
  • Shadish, W. R. (2014b). Statistical analyses of single-case designs: The shape of things to come. Current Directions in Psychological Science, 23(2), 139–146. https://doi.org/10.1177/0963721414524773
  • Shadish, W. R., Hedges, L. V., Horner, R. H., & Odom, S. L. (2015). The role of between-case effect size in conducting, interpreting, and summarizing single-case research (NCER 2015-002). National Center for Education Research, Institute of Education Sciences. http://ies.ed.gov/ncser/pubs/2015002/
  • Shea, B. J., Grimshaw, J. M., Wells, G. A., Boers, M., Andersson, N., Hamel, C., Porter, A. C., Tugwell, P., & Bouter, L. M. (2007). Development of AMSTAR: A measurement tool to assess the methodological quality of systematic reviews. BMC Medical Research Methodology, 7(10), 1–7. https://doi.org/10.1186/1471-2288-7-10
  • Smith, J. D. (2012). Single-case experimental designs: A systematic review of published research and current standards. Psychological Methods, 17(4), 1–70. https://doi.org/10.1037/a0029312
  • Solomon, B. G., Klein, S. A., Hintze, J. M., Cressey, J. M., & Peller, S. L. (2012). A meta-analysis of school-wide positive behavior support: An exploratory study using single-case synthesis. Psychology in the Schools, 49(2), 274–283. https://doi.org/10.1002/pits.20625
  • Swanson, H. L., & Sachse Lee, C. M. (2000). A meta-analysis of single-subject-design intervention research for students with LD. Journal of Learning Disabilities, 33(2), 114–136. https://doi.org/10.1177/002221940003300201
  • Tincani, M., & De Mers, M. (2016). Meta-analysis of single-case research design studies on instructional pacing. Behavior Modification, 40(6), 799–824. https://doi.org/10.1177/0145445516643488
  • Van den Noortgate, W., & Onghena, P. (2003a). Combining single-case experimental data using hierarchical linear models. School Psychology Quarterly, 18(3), 325–346. https://doi.org/10.1521/scpq.18.3.325.22577
  • Van den Noortgate, W., & Onghena, P. (2003b). Hierarchical linear models for the quantitative integration of effect sizes in single-case research. Behavior Research Methods, Instruments, & Computers, 35(1), 1–10. https://doi.org/10.3758/BF03195492
  • Van den Noortgate, W., & Onghena, P. (2008). A multilevel meta-analysis of single-subject experimental design studies. Evidence-Based Communication Assessment and Intervention, 2(3), 142–151. https://doi.org/10.1080/17489530802505362
  • Van den Noortgate, W., López-López, J. A., Marín-Martínez, F., & Sánchez-Meca, J. (2013). Three-level meta-analysis of dependent effect sizes. Behavior Research Methods, 45(2), 576–594. https://doi.org/10.3758/s13428-012-0261-6
  • Van den Noortgate, W., López-López, J. A., Marín-Martínez, F., & Sánchez-Meca, J. (2015). Meta-analysis of multiple outcomes: A multilevel approach. Behavior Research Methods, 47(4), 1274–1294. https://doi.org/10.3758/s13428-014-0527-2
  • Vannest, K. J., & Ninci, J. (2015). Evaluating intervention effects in single-case research designs. Journal of Counseling and Development, 93(4), 403–411. https://doi.org/10.1002/jcad.12038
  • Vegas, K. C., Jenson, W. R., & Kircher, J. C. (2007). A single-subject meta-analysis of the effectiveness of time-out in reducing disruptive classroom behavior. Behavioral Disorders, 32(2), 109–121. https://doi.org/10.1177/019874290703200204
  • Wakefield, J. A., Jr. (1980). Relationship between two expressions of reliability: Percentage agreement and Phi. Educational and Psychological Measurement, 40(3), 593–597. https://doi.org/10.1177/001316448004000304
  • Weng, P.-L., Maeda, Y., & Bouck, E. C. (2014). Effectiveness of cognitive skills-based computer-assisted instruction for students with disabilities: A synthesis. Remedial and Special Education, 35(3), 167–180. https://doi.org/10.1177/0741932513514858
  • Wolery, M., Busick, M., Reichow, B., & Barton, E. E. (2010). Comparison of overlap methods for quantitatively synthesizing single-subject data. The Journal of Special Education, 44(1), 18–28. https://doi.org/10.1177/0022466908328009
  • Zhang, J., & Wheeler, J. J. (2011). A meta-analysis of peer-mediated interventions for young children with autism spectrum disorders. Education and Training in Autism and Developmental Disabilities, 46(1), 62–77.
  • Zheng, X., Flynn, L. J., & Swanson, H. L. (2013). Experimental intervention studies on word problem solving and math disabilities: A selective analysis of the literature. Learning Disability Quarterly, 36(2), 97–111. https://doi.org/10.1177/0731948712444277
