Quantitative integration of single-subject studies: Methods and misinterpretations

  • In Response
  • Published in The Behavior Analyst

Abstract

Derenne and Baron (1999) criticized a quantitative literature review by Kollins, Newland, and Critchfield (1997) and raised several important issues with respect to the integration of single-subject data. In their criticism they argued that the quantitative integration of data across experiments conducted by Kollins et al. is a meta-analysis and, as such, is inappropriate. We reply that Kollins et al. offered behavior analysts a technique for integrating quantitative information in a way that draws from the strengths of behavior analysis. Although the quantitative technique is true to the original spirit of meta-analysis, it bears little resemblance to meta-analyses as currently conducted or defined and offers behavior analysts a potentially useful tool for comparing data from multiple sources. We also argue that other criticisms raised by Derenne and Baron were inaccurate or irrelevant to the original article. Our response highlights two main points: (a) There are meaningful quantitative techniques for examining single-subject data across studies without compromising the integrity of behavior analysis; and (b) the healthiest way to refute or question findings in any viable field of scientific inquiry is through empirical investigation.
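
As a concrete illustration of point (a), the sort of integration at issue can be sketched with the generalized matching law (Baum, 1974), log(B1/B2) = a · log(R1/R2) + log b, whose sensitivity parameter a is the kind of quantity collated across studies in a review such as Kollins et al. (1997). The sketch below is a minimal, hypothetical example: the data, study and subject labels, and the simple least-squares fit are assumptions made for illustration, not the procedures reported in the original article.

```python
import numpy as np

def sensitivity(reinforcer_ratios, response_ratios):
    """Least-squares fit of the generalized matching law (Baum, 1974):
    log(B1/B2) = a * log(R1/R2) + log b.
    Returns the sensitivity (slope) a and bias (intercept) log b for one subject."""
    x = np.log10(reinforcer_ratios)
    y = np.log10(response_ratios)
    a, log_b = np.polyfit(x, y, 1)
    return a, log_b

# Hypothetical per-subject data from two studies: each entry holds one subject's
# obtained reinforcer ratios and response ratios across concurrent schedules.
subjects = {
    ("Study A", "S1"): ([0.25, 0.5, 2.0, 4.0], [0.4, 0.7, 1.6, 2.4]),
    ("Study A", "S2"): ([0.25, 0.5, 2.0, 4.0], [0.3, 0.6, 1.8, 3.1]),
    ("Study B", "S1"): ([0.20, 1.0, 5.0],       [0.5, 1.0, 2.2]),
}

# Integrate across experiments by collecting each subject's fitted sensitivity
# and summarizing that distribution, rather than pooling raw or group data.
estimates = [sensitivity(r, b)[0] for r, b in subjects.values()]
print(f"n = {len(estimates)}, median a = {np.median(estimates):.2f}, "
      f"range = {min(estimates):.2f}-{max(estimates):.2f}")
```

Summarizing per-subject parameter estimates in this way, rather than converting group means into conventional effect sizes, is what keeps the individual subject as the unit of analysis.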


References

  • Allison, D. B., & Gorman, B. S. (1993). Calculating effect sizes for meta-analysis: The case of the single case. Behaviour Research and Therapy, 31, 621–631.
  • Baron, A., & Galizio, M. (1983). Instructional control of human operant behavior. The Psychological Record, 33, 495–520.
  • Baron, A., & Perone, M. (1982). The place of the human subject in the operant laboratory. The Behavior Analyst, 5, 143–158.
  • Baron, A., Perone, M., & Galizio, M. (1991a). Analyzing the reinforcement process at the human level: Can application and behavioristic interpretation replace laboratory research? The Behavior Analyst, 14, 95–105.
  • Baron, A., Perone, M., & Galizio, M. (1991b). The experimental analysis of human behavior: Indispensable, ancillary, or irrelevant? The Behavior Analyst, 14, 145–155.
  • Baum, W. M. (1974). On two types of deviation from the matching law: Bias and undermatching. Journal of the Experimental Analysis of Behavior, 22, 231–242.
  • Baum, W. M. (1979). Matching, undermatching, and overmatching in studies of choice. Journal of the Experimental Analysis of Behavior, 32, 269–281.
  • Branch, M. N. (1991). On the difficulty of studying “basic” behavioral processes in humans. The Behavior Analyst, 14, 107–110.
  • Busk, P. L., & Serlin, R. C. (1992). Meta-analysis for single-case research. In T. R. Kratochwill & J. R. Levin (Eds.), Single case research design and analysis: New directions for psychology and education (pp. 187–212). Hillsdale, NJ: Erlbaum.
  • Cleveland, W. S. (1985). The elements of graphing data. Pacific Grove, CA: Wadsworth and Brooks/Cole.
  • Davison, M., & McCarthy, D. (1988). The matching law. Hillsdale, NJ: Erlbaum.
  • Derenne, A., & Baron, A. (1999). Human sensitivity to reinforcement: A comment on Kollins, Newland, and Critchfield’s (1997) quantitative literature review. The Behavior Analyst, 22, 35–42.
  • Dougherty, D. M. (1994). The selective renaissance of the experimental analysis of human behavior. The Behavior Analyst, 17, 169–174.
  • Gingerich, W. J. (1984). Meta-analysis of applied time-series data. Journal of Applied Behavioral Science, 20, 71–79.
  • Glass, G. V., McGaw, B., & Smith, M. L. (1981). Meta-analysis in social research. Beverly Hills, CA: Sage.
  • Herrnstein, R. J. (1970). On the law of effect. Journal of the Experimental Analysis of Behavior, 13, 243–266.
  • Horne, P. J., & Lowe, C. F. (1993). Determinants of human performance on concurrent schedules. Journal of the Experimental Analysis of Behavior, 59, 29–60.
  • Hunter, J. E., & Schmidt, F. L. (1990). Methods of meta-analysis: Correcting error and bias in research findings. Newbury Park, CA: Sage.
  • Hyten, C., & Madden, G. J. (1993). The scallop in human fixed-interval research: A review of problems with data description. The Psychological Record, 43, 471–500.
  • Hyten, C., & Reilly, M. P. (1992). The renaissance of the experimental analysis of human behavior. The Behavior Analyst, 15, 109–114.
  • Kollins, S. H., Newland, M. C., & Critchfield, T. S. (1997). Human sensitivity to reinforcement in operant choice: How much do consequences matter? Psychonomic Bulletin & Review, 4, 208–220. Erratum: Psychonomic Bulletin & Review, 4, 431.
  • Mosteller, F. (1990). The future of meta-analysis. In K. W. Wachter & M. L. Straf (Eds.), The future of meta-analysis (pp. 185–190). New York: Russell Sage Foundation.
  • Myers, D. L., & Myers, L. E. (1977). Undermatching: A reappraisal of performance on concurrent variable-interval schedules of reinforcement. Journal of the Experimental Analysis of Behavior, 27, 203–214.
  • Newland, M. C. (1997). Quantifying the molecular structure of behavior: Separate effects of caffeine, cocaine, and adenosine agonists on interresponse times and lever press durations. Behavioural Pharmacology, 8, 1–16.
  • Perone, M. (1985). On the impact of human operant research: Asymmetrical patterns of cross-citation between human and nonhuman research. The Behavior Analyst, 8, 185–189.
  • Pierce, W. D., & Epling, W. F. (1983). Choice, matching, and human behavior: A review of the literature. The Behavior Analyst, 6, 57–76.
  • Salzberg, C. L., Strain, P. S., & Baer, D. M. (1987). Meta-analysis for single subject research: When does it clarify? When does it obscure? Remedial and Special Education, 8, 43–48.
  • Scruggs, T. E., & Mastropieri, M. A. (1998). Summarizing single-subject research: Issues and applications. Behavior Modification, 22, 221–242.
  • Scruggs, T. E., Mastropieri, M. A., & Casto, G. (1987a). The quantitative synthesis of single subject research: Methodology and validation. Remedial and Special Education, 8, 24–33.
  • Scruggs, T. E., Mastropieri, M. A., & Casto, G. (1987b). “The quantitative synthesis of single subject research: Methodology and validation”: Reply to Owen White. Remedial and Special Education, 8, 40–42.
  • Scruggs, T. E., Mastropieri, M. A., & Casto, G. (1987c). “The quantitative synthesis of single subject research: Methodology and validation”: Response to Salzberg, Strain, and Baer. Remedial and Special Education, 8, 49–52.
  • Shull, R. L., & Lawrence, P. S. (1991). Preparations and principles. The Behavior Analyst, 14, 133–138.
  • Sidman, M. (1960). Tactics of scientific research. New York: Free Press.
  • Takahashi, M., & Iwamoto, T. (1986). Human concurrent performances: The effects of experience, instructions, and schedule-correlated stimuli. Journal of the Experimental Analysis of Behavior, 45, 257–267.
  • Wachter, K. W., & Straf, M. L. (Eds.). (1990). The future of meta-analysis. New York: Russell Sage Foundation.
  • Wanchisen, B. A., Tatham, T. A., & Mooney, S. E. (1989). Variable-ratio conditioning history produces high- and low-rate fixed-interval performance in rats. Journal of the Experimental Analysis of Behavior, 52, 167–179.
  • Weiner, H. (1969). Controlling human fixed-interval performance. Journal of the Experimental Analysis of Behavior, 12, 349–373.
  • Weiner, H. (1972). Controlling human fixed-interval performance with fixed-ratio responding or differential reinforcement of low-rate responding in mixed schedules. Psychonomic Science, 26, 191–192.
  • White, D. M. (1987). “The quantitative synthesis of single subject research: Methodology and validation”: Comment. Remedial and Special Education, 8, 34–39.
  • White, D. M., Rusch, F. R., Kazdin, A. E., & Hartmann, D. P. (1989). Applications of meta-analysis in individual subject research. Behavioral Assessment, 11, 281–296.
  • Zeiler, M. D. (1984). The sleeping giant: Reinforcement schedules. Journal of the Experimental Analysis of Behavior, 42, 485–493.

Author information

Corresponding author

Correspondence to Scott H. Kollins.

Additional information

All authors contributed equally to this work; the order of authorship was therefore carried over from the previously published paper. Work on this manuscript was supported in part by a Faculty Research Development Award from Western Michigan University (S.H.K.) and Grant ES 06466 from the National Institute of Environmental Health Sciences (M.C.N.).

About this article

Cite this article

Kollins, S. H., Newland, M. C., & Critchfield, T. S. (1999). Quantitative integration of single-subject studies: Methods and misinterpretations. The Behavior Analyst, 22, 149–157. https://doi.org/10.1007/BF03391992
