Abstract
Reliable and accurate visual analysis of graphically depicted behavioral data acquired using single-case experimental designs (SCEDs) is integral to behavior-analytic research and practice. Researchers have developed a range of techniques to make visual inspection of SCED data more reliable and objective, including visual interpretive guides, statistical techniques, and nonstatistical quantitative methods, with the aims of guiding clinicians and ensuring a replicable data-interpretation process in research. These structured analytic practices are now used more frequently by behavior analysts and are the subject of considerable research at the intersection of quantitative methods and behavior analysis. Some contemporary analytic methods have preliminary support with simulated datasets but have not been thoroughly examined with nonsimulated clinical datasets. Other relatively new techniques (e.g., fail-safe k) likewise have preliminary support but require additional research. Still other methods (e.g., the dual-criteria and conservative dual-criteria methods) have more extensive support but have infrequently been compared against other analytic methods. Across three studies, we examined how these methods corresponded to clinical outcomes (and to one another) in order to replicate and extend the extant literature in this area. Implications and recommendations for practitioners and researchers are discussed.
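The dual-criteria (DC) and conservative dual-criteria (CDC) methods named in the abstract can be sketched in a few lines of code. The sketch below is an illustration based on the commonly described CDC procedure, not the authors' implementation: two criterion lines (the baseline mean and the baseline least-squares trend) are projected into the treatment phase and shifted by 0.25 baseline standard deviations in the direction of the predicted change, and the number of treatment-phase points beyond both lines is compared to a binomial criterion. The function name and the data in the usage example are hypothetical.

```python
import numpy as np
from scipy import stats


def conservative_dual_criteria(baseline, treatment, decrease=True,
                               shift_sd=0.25, alpha=0.05):
    """Flag a treatment effect with a CDC-style test.

    Fits two criterion lines to the baseline phase (a mean line and an
    OLS trend line), projects them into the treatment phase, shifts both
    by `shift_sd` baseline standard deviations toward the predicted
    change, and compares the count of treatment points beyond BOTH lines
    to a binomial criterion (chance probability .5). With shift_sd=0,
    this reduces to the (nonconservative) dual-criteria method.
    Requires at least two baseline points.
    """
    baseline = np.asarray(baseline, dtype=float)
    treatment = np.asarray(treatment, dtype=float)
    shift = shift_sd * baseline.std(ddof=1)
    sign = -1.0 if decrease else 1.0  # direction of predicted change

    # Criterion line 1: baseline mean, shifted toward the predicted change.
    mean_line = baseline.mean() + sign * shift

    # Criterion line 2: baseline OLS trend, projected into the treatment
    # phase and shifted the same way.
    x_base = np.arange(len(baseline))
    fit = stats.linregress(x_base, baseline)
    x_treat = np.arange(len(baseline), len(baseline) + len(treatment))
    trend_line = fit.intercept + fit.slope * x_treat + sign * shift

    # Count treatment points beyond both criterion lines.
    if decrease:
        hits = np.sum((treatment < mean_line) & (treatment < trend_line))
    else:
        hits = np.sum((treatment > mean_line) & (treatment > trend_line))

    # Smallest count that would be unlikely (p < alpha) by chance alone.
    n = len(treatment)
    needed = next(k for k in range(n + 1)
                  if stats.binom.sf(k - 1, n, 0.5) < alpha)
    return bool(hits >= needed), int(hits), int(needed)


# Hypothetical data: a decreasing target behavior after treatment onset.
flag, hits, needed = conservative_dual_criteria(
    baseline=[8, 9, 10, 9, 8], treatment=[2, 1, 2, 1, 0, 1])
```

In this hypothetical example, all six treatment points fall below both shifted criterion lines, which meets the binomial criterion for six treatment observations, so the function flags an effect. The 0.25-SD shift and the .05 binomial cutoff are the commonly reported CDC defaults, not values taken from this article.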
![](http://media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs40614-021-00313-y/MediaObjects/40614_2021_313_Fig1_HTML.png)
![](http://media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs40614-021-00313-y/MediaObjects/40614_2021_313_Fig2_HTML.png)
![](http://media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs40614-021-00313-y/MediaObjects/40614_2021_313_Fig3_HTML.png)
![](http://media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs40614-021-00313-y/MediaObjects/40614_2021_313_Fig4_HTML.png)
![](http://media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs40614-021-00313-y/MediaObjects/40614_2021_313_Fig5_HTML.png)
![](http://media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs40614-021-00313-y/MediaObjects/40614_2021_313_Fig6_HTML.png)
![](http://media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs40614-021-00313-y/MediaObjects/40614_2021_313_Fig7_HTML.png)
Ethics declarations
Conflict of interest
The authors report no conflicts of interest and adhered to all applicable ethical standards.
Appendix
Illustrative Examples of Clinical Outcomes Suggestive of Effective and Noneffective Treatments. Note. Hypothetical data series from clinical treatment evaluations corresponding to an intervention (“Treatment 1”) indicating a clinical effect (top panel) or not indicating a clinical effect (bottom panel). The A-B phase in the top panel would correspond to an outcome demonstrating a clinical effect for Treatment 1. The A-B phase in the bottom panel would correspond to an outcome failing to demonstrate a clinical effect for Treatment 1.
About this article
Cite this article
Falligant, J.M., Kranak, M.P. & Hagopian, L.P. Further Analysis of Advanced Quantitative Methods and Supplemental Interpretative Aids with Single-Case Experimental Designs. Perspect Behav Sci 45, 77–99 (2022). https://doi.org/10.1007/s40614-021-00313-y