Meta-analytic Decisions and Reliability: A Serendipitous Case of Three Independent Telecommuting Meta-analyses

Published in the Journal of Business and Psychology.

Abstract

Purpose

Despite the potential for researcher decisions to negatively impact the reliability of meta-analysis, few methodological studies have examined this possibility. The present study compared three independent, concurrent telecommuting meta-analyses to determine how researcher decisions affected the process and findings of these studies.

Methodology

A case study methodology was used, in which three recent telecommuting meta-analyses were re-examined and compared using the process model developed by Wanous et al. (J Appl Psychol 74:259–264, 1989).

Findings

Results demonstrated important ways in which researcher decisions converged and diverged at stages of the meta-analytic process. The influence of researcher divergence on meta-analytic findings was neither evident in all cases, nor straightforward. Most notably, the overall effects of telecommuting across a range of employee outcomes were generally consistent across the meta-analyses, despite substantial differences in meta-analytic samples.

Implications

Results suggest that the effect of researcher decisions on meta-analytic findings may be largely indirect, such as when early decisions guide the specific moderation tests that can be undertaken at later stages. However, directly comparable “main effect” findings appeared to be more robust to divergence in researcher decisions. These results provide tentative positive evidence regarding the reliability of meta-analytic methods and suggest targeted areas for future methodological studies.

Originality

This study presents unique insight into a methodological issue that has not received adequate research attention, yet has potential implications for the reliability and validity of meta-analysis as a method.


Notes

  1. A comprehensive summary of these methodological issues is beyond the scope of the current paper; interested readers can refer to recent overviews by Burke and Landis (2003) and Schulze (2007).

  2. The primary focus of our literature review was on studies of concurrent meta-analyses because they provide the most commensurate basis for studying meta-analytic decisions. By contrast, meta-analytic updates are less useful for this purpose because they are conducted only after a substantial body of new literature on a topic has accumulated.

  3. In addition, Ones et al. identified several computational errors in the moderator analyses conducted by Tett et al., although those particular analyses were not relevant to the discrepant findings noted above.

  4. Requests for additional information were sent to Gajendran and Harrison. However, in some cases necessary details were not available from these authors.

  5. A full listing of the studies included in each meta-analysis is available upon request.

  6. The term “weak support” describes a situation in which the confidence intervals for two estimated mean effects at different levels of a categorical moderator variable overlap one another, but only one of the intervals includes 0. This implies that the effect of telecommuting differs significantly from zero at only one level of the moderator, but that, in a stricter sense, the estimates for the two levels do not differ significantly from one another.

  7. We would like to thank an anonymous reviewer for pointing out that many meta-analysts are promoting the idea of using specialized personnel to retrieve studies from a domain (e.g., a reference librarian) and that this could serve as a possible alternative to representative sampling or other approaches to handling prohibitively large research literatures.
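The “weak support” criterion described in note 6 can be expressed as a simple check on two interval endpoints. The sketch below is illustrative only, not the authors’ procedure; the `weak_support` helper and the example intervals are hypothetical.

```python
def weak_support(ci_a, ci_b):
    """Illustrative check of the 'weak support' criterion from note 6.

    ci_a, ci_b: (lower, upper) confidence intervals for the mean effect
    at two levels of a categorical moderator variable.
    Returns True when the intervals overlap but exactly one includes 0,
    i.e., the effect differs from zero at only one moderator level while
    the two estimates do not differ significantly from each other.
    """
    overlap = ci_a[0] <= ci_b[1] and ci_b[0] <= ci_a[1]
    includes_zero_a = ci_a[0] <= 0 <= ci_a[1]
    includes_zero_b = ci_b[0] <= 0 <= ci_b[1]
    return overlap and (includes_zero_a != includes_zero_b)

# Level A differs from zero, level B does not, and the intervals overlap:
print(weak_support((0.05, 0.30), (-0.10, 0.20)))  # True
# Both levels differ from zero, so the criterion does not apply:
print(weak_support((0.05, 0.30), (0.10, 0.40)))   # False
```

Note that non-overlapping intervals are a conservative test of moderation; the criterion above flags only the ambiguous in-between case.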

References

  • Aguinis, H., Dalton, D. A., Bosco, F. A., Pierce, C. A., & Dalton, C. M. (2009, August). Meta-analytic choices and judgment calls: Implications for theory and scholarly impact. Paper presented at the meeting of the Academy of Management, Chicago, IL.

  • Allen, M., & Preiss, R. (1993). Replication and meta-analysis: A necessary connection. Journal of Social Behavior and Personality, 8, 9–20.

  • Barrick, M. R., & Mount, M. K. (1991). The Big Five personality dimensions and job performance: A meta-analysis. Personnel Psychology, 44, 1–26.

  • Barrick, M. R., Mount, M. K., & Judge, T. A. (2001). Personality and performance at the beginning of the new millennium: What do we know and where do we go next? International Journal of Selection and Assessment, 9, 9–30.

  • Beal, D. J., Corey, D. M., & Dunlap, W. P. (2002). On the bias of Huffcutt and Arthur’s (1995) procedure for identifying outliers in the meta-analysis of correlations. Journal of Applied Psychology, 87, 583–589.

  • Beaman, A. L. (1991). An empirical comparison of meta-analytic and traditional reviews. Personality and Social Psychology Bulletin, 17, 252–257.

  • Bobko, P., & Roth, P. L. (2008). Psychometric accuracy and (the continuing need for) quality thinking in meta-analysis. Organizational Research Methods, 11, 114–126.

  • Briggs, D. C. (2005). Meta-analysis: A case study. Evaluation Review, 29, 87–127.

  • Bullock, R. J., & Svyantek, D. J. (1985). Analyzing meta-analysis: Potential problems, an unsuccessful replication, and evaluation criteria. Journal of Applied Psychology, 70, 108–115.

  • Burke, M. J., & Landis, R. S. (2003). Methodological and conceptual challenges in conducting and interpreting meta-analyses. In K. R. Murphy (Ed.), Validity generalization: A critical review (pp. 287–310). Mahwah, NJ: Lawrence Erlbaum Associates.

  • Campion, M. A. (1993). Article review checklist: A criterion checklist for reviewing research articles in applied psychology. Personnel Psychology, 46, 705–718.

  • Cortina, J. M. (2002). Big things have small beginnings: An assortment of “minor” methodological misunderstandings. Journal of Management, 28, 339–362.

  • Cortina, J. M. (2003). Apples and oranges (and pears, oh my!): The search for moderators in meta-analysis. Organizational Research Methods, 6, 415–439.

  • Cortina, J. M., & Dunlap, W. P. (1997). On the logic and purpose of significance testing. Psychological Methods, 2, 161–172.

  • Cree, L. H. (1999). Work/family balance of telecommuters. Dissertation Abstracts International Section B: The Sciences and Engineering, 59(11-B), 6100.

  • Dieckmann, N. F., Malle, B. F., & Bodner, T. E. (2009). An empirical assessment of meta-analytic practice. Review of General Psychology, 13, 101–115.

  • Eden, D. (2002). Replication, meta-analysis, scientific progress, and AMJ’s publication policy. Academy of Management Journal, 45, 841–846.

  • Egger, M., & Smith, G. D. (1998). Meta-analysis: Bias in location and selection of studies. British Medical Journal, 316, 61–66.

  • Field, A. P. (2001). Meta-analysis of correlation coefficients: A Monte Carlo comparison of fixed- and random-effects methods. Psychological Methods, 6, 161–180.

  • Gajendran, R. S., & Harrison, D. A. (2007). The good, the bad, and the unknown about telecommuting: Meta-analysis of psychological mediators and individual consequences. Journal of Applied Psychology, 92, 1524–1541.

  • Geyskens, I., Krishnan, R., Steenkamp, J. B. E. M., & Cunha, P. V. (2009). A review and evaluation of meta-analysis practices in management research. Journal of Management, 35, 393–419.

  • Greenhalgh, T., & Peacock, R. (2005). Effectiveness and efficiency of search methods in systematic reviews of complex evidence: Audit of primary sources. British Medical Journal, 331, 1064–1065.

  • Greenhouse, J. B., & Iyengar, S. (1994). Sensitivity analysis and diagnostics. In H. M. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 503–520). New York: Russell Sage Foundation.

  • Hackett, R. D., & Guion, R. M. (1985). A re-evaluation of the absenteeism-job satisfaction relationship. Organizational Behavior and Human Decision Processes, 35, 340–381.

  • Hackman, J. R., & Oldham, G. R. (1975). Development of the Job Diagnostic Survey. Journal of Applied Psychology, 60, 159–170.

  • Hale, J., & Dillard, J. (1991). The uses of meta-analysis: Making knowledge claims and setting research agendas. Communication Monographs, 58, 463–471.

  • Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. Orlando, FL: Academic Press.

  • Hunter, J. E., & Schmidt, F. L. (2000). Fixed effects vs. random effects meta-analysis models: Implications for cumulative research knowledge. International Journal of Selection and Assessment, 8, 275–292.

  • Hunter, J. E., & Schmidt, F. L. (2004). Methods of meta-analysis: Correcting error and bias in research findings (2nd ed.). Thousand Oaks, CA: Sage.

  • Kisamore, J., & Brannick, M. (2008). An illustration of the consequences of meta-analysis model choice. Organizational Research Methods, 11, 35–53.

  • Konstantopoulos, S., & Hedges, L. V. (2009). Analyzing effect sizes: Fixed-effects models. In H. M. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (2nd ed., pp. 279–293). New York: Russell Sage Foundation.

  • Kromrey, J. D., & Rendina-Gobioff, G. (2006). On knowing what we do not know: An empirical comparison of methods to detect publication bias in meta-analysis. Educational and Psychological Measurement, 66, 357–373.

  • Miner, J. B., & Raju, N. S. (2004). Risk propensity differences between managers and entrepreneurs and between low- and high-growth entrepreneurs: A reply in a more conservative vein. Journal of Applied Psychology, 89, 3–13.

  • Nicklin, J. M., Mayfield, C. O., Caputo, P. M., Arboleda, M. A., Cosentino, R. E., Lee, M., et al. (2009). Does telecommuting increase organizational attitudes and outcomes: A meta-analysis. Pravara Management Review, 8, 2–16.

  • Nieminen, L. R. G., Chakrabarti, M., McClure, T. K., & Baltes, B. B. (2008). A meta-analysis of the effects of telecommuting on employee outcomes. Paper presented at the 23rd Annual Conference of the Society for Industrial and Organizational Psychology, San Francisco, CA.

  • Ones, D. S., Mount, M. K., Barrick, M. R., & Hunter, J. E. (1994). Personality and job performance: A critique of the Tett, Jackson, and Rothstein (1991) meta-analysis. Personnel Psychology, 47, 147–156.

  • Overton, R. C. (1998). A comparison of fixed-effects and mixed (random-effects) models for meta-analysis tests of moderator variable effects. Psychological Methods, 3, 354–379.

  • Oyer, E. J. (1997). Validity and impact of meta-analyses in early intervention research. Dissertation Abstracts International Section A: Humanities and Social Sciences, 57(7-A), 2859.

  • Raghuram, S., & Wiesenfeld, B. (2004). Work-nonwork conflict and job stress among virtual workers. Human Resource Management, 43, 259–277.

  • Rothstein, H. R., & McDaniel, M. A. (1989). Guidelines for conducting and reporting meta-analyses. Psychological Reports, 65, 759–770.

  • Rothstein, H., Sutton, A. J., & Borenstein, M. (Eds.). (2005). Publication bias in meta-analysis: Prevention, assessment, and adjustments. Chichester, UK: Wiley.

  • Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. American Psychologist, 55, 68–78.

  • Schmidt, F. L., & Hunter, J. E. (1999). Comparison of three meta-analysis methods revisited: An analysis of Johnson, Mullen, and Salas (1995). Journal of Applied Psychology, 84, 144–148.

  • Schulze, R. (2004). Meta-analysis: A comparison of approaches. Cambridge: Hogrefe & Huber.

  • Schulze, R. (2007). Current methods for meta-analysis: Approaches, issues, and developments. Journal of Psychology. Special Issue: The State of the Art of Meta-Analysis, 215(2), 90–103.

  • Scott, K. D., & Taylor, D. S. (1985). An examination of conflicting findings on the relationship between job satisfaction and absenteeism: A meta-analysis. Academy of Management Journal, 28, 599–612.

  • Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin.

  • Staples, D. S. (2001). A study of remote workers and their differences from non-remote workers. Journal of End User Computing, 13, 3–14.

  • Stewart, W. H., Jr., & Roth, P. L. (2001). Risk propensity differences between entrepreneurs and managers: A meta-analytic review. Journal of Applied Psychology, 86, 145–153.

  • Stewart, W. H., Jr., & Roth, P. L. (2004). Data quality affects meta-analytic conclusions: A response to Miner and Raju (2004) concerning entrepreneurial risk propensity. Journal of Applied Psychology, 89, 14–21.

  • Tett, R. P., Jackson, D. N., & Rothstein, M. (1991). Personality measures as predictors of job performance: A meta-analytic review. Personnel Psychology, 44, 703–742.

  • Wanous, J. P., Sullivan, S. E., & Malinak, J. (1989). The role of judgment calls in meta-analysis. Journal of Applied Psychology, 74, 259–264.

  • Wells, K., & Littell, J. H. (2009). Study quality assessment in systematic reviews of research on intervention effects. Research on Social Work Practice, 19, 52–62.


Acknowledgment

We would like to thank Boris Baltes and Christopher Berry for their constructive comments on a previous draft of this manuscript.


Corresponding author

Correspondence to Levi R. G. Nieminen.


Cite this article

Nieminen, L.R.G., Nicklin, J.M., McClure, T.K. et al. Meta-analytic Decisions and Reliability: A Serendipitous Case of Three Independent Telecommuting Meta-analyses. J Bus Psychol 26, 105–121 (2011). https://doi.org/10.1007/s10869-010-9185-2

