Assessing Publication Bias: a 7-Step User’s Guide with Best-Practice Recommendations

  • Original Paper
  • Published in: Journal of Business and Psychology

Abstract

Meta-analytic reviews are a primary avenue for the generation of cumulative knowledge in the organizational and psychological sciences. Over the past decade or two, concern has been raised about the possibility of publication bias influencing meta-analytic results, which can distort our cumulative knowledge and lead to erroneous practical recommendations. Unfortunately, no clear guidelines exist for how meta-analysts ought to assess this bias. To address this issue, this paper develops a user’s guide with best-practice recommendations for the assessment of publication bias in meta-analytic reviews. To do this, we review the literature on publication bias and develop a step-by-step process to assess the presence of publication bias and gauge its effects on meta-analytic results. Examples of tools and best practices are provided to aid meta-analysts when implementing the process in their own research. Although the paper is written primarily for organizational and psychological scientists, the guide and recommendations are not limited to any particular scientific domain.

Figs. 1–7 (figure images omitted from this excerpt)

Notes

  1. We note that a more accurate term for the type of bias addressed in this paper is dissemination bias. This term subsumes publication bias, outcome reporting bias, time-lag bias, language bias, gray literature bias, and citation bias (Higgins et al., 2011; Song et al., 2010, 2013). However, the term publication bias is typically used in our literature to encompass all of these biases, perhaps because these other biases can cause publication bias, especially if the literature search is not conducted in a systematic and thorough fashion. Furthermore, the methods to assess bias cannot distinguish between the different types of biases. Therefore, aligned with convention in our literature, we use the established term publication bias throughout our paper.

  2. As an example, when common method variance is present, published effect sizes may be systematically biased in one direction at the primary study level, which will lead to biased meta-analytic results. This is not the fault of the meta-analyst or the statistical technique; the naïve meta-analytic mean may still be an unbiased estimate of the effect sizes from the available studies. However, because the publicly available studies are not a representative sample of all studies on the relevant topic, the resulting naïve mean is also not representative of all effect sizes from primary studies on a particular topic.
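
The suppression dynamic described in this note can be illustrated with a small simulation (a hypothetical sketch, not a method from the paper; the effect size, sample sizes, and significance-only "publication rule" below are all assumptions for illustration): generate many studies around a true effect, "publish" only the statistically significant ones, and compare the naïve means.

```python
import math
import random

random.seed(42)

TRUE_D = 0.2       # assumed true standardized mean difference
N_PER_GROUP = 30   # assumed per-group sample size
N_STUDIES = 2000

# SE of d is approximated here by sqrt(2/n), which is adequate for small d
SE_D = math.sqrt(2 / N_PER_GROUP)

all_ds, published_ds = [], []
for _ in range(N_STUDIES):
    d = random.gauss(TRUE_D, SE_D)   # observed effect of one study
    all_ds.append(d)
    # Hypothetical suppression rule: only z > 1.96 (p < .05, one-sided) is published
    if d / SE_D > 1.96:
        published_ds.append(d)

naive_all = sum(all_ds) / len(all_ds)
naive_pub = sum(published_ds) / len(published_ds)
print(f"mean of all studies:       {naive_all:.3f}")
print(f"mean of published studies: {naive_pub:.3f}")
```

The published-only mean is an unbiased summary of the available studies, yet it substantially overestimates the true effect because the available studies are not a representative sample, which is exactly the distinction the note draws.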

  3. We note that, instead of precision, the standard error (SE) is often used in the medical sciences. Funnel plots in applied psychology and related fields often depict precision (1/SE; e.g., Kepes et al., 2012), while the medical sciences use SE. This may be related to the different types of effect sizes these fields typically use (e.g., correlations and standardized mean differences in the social sciences versus odds and risk ratios in the medical sciences). Similar plots using SE or other statistics instead of precision can be found in the literature (e.g., Sterne & Egger, 2005; Sterne et al., 2011). In either case, SE and precision provide the same information, merely on different scales.
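
The SE/precision equivalence can be made concrete for correlational data, where the standard formula for the SE of a Fisher's z-transformed correlation is 1/sqrt(n − 3). A minimal sketch (the sample sizes are hypothetical) showing that either quantity could label a funnel plot's vertical axis:

```python
import math

# Hypothetical per-study sample sizes for a meta-analysis of correlations
sample_sizes = [40, 80, 150, 400]

for n in sample_sizes:
    se = 1 / math.sqrt(n - 3)   # SE of Fisher's z for a correlation
    precision = 1 / se          # precision, as often plotted in applied psychology
    print(f"n={n:4d}  SE={se:.3f}  precision={precision:.2f}")
```

Larger studies sit higher on a precision-scaled funnel plot and lower on an SE-scaled one (which is why SE axes are usually inverted), but the ordering of studies is identical.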

  4. Fisher’s z tends to be the default in most software packages when using correlational data. However, typically, other statistics (e.g., r, d, odds ratios) can also be displayed.
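
When Fisher's z is the default, analyses are run on the transformed scale and the result is back-transformed to r for display. A minimal sketch of the two transformations (the example correlation of .30 is hypothetical):

```python
import math

def r_to_z(r: float) -> float:
    """Fisher's z transformation of a correlation (equivalently, atanh(r))."""
    return 0.5 * math.log((1 + r) / (1 - r))

def z_to_r(z: float) -> float:
    """Back-transform Fisher's z to a correlation."""
    return math.tanh(z)

r = 0.30  # hypothetical meta-analytic mean correlation
z = r_to_z(r)
print(f"r = {r}  ->  Fisher z = {z:.4f}")
print(f"back-transformed r = {z_to_r(z):.4f}")
```

The transformation is monotone and invertible, so displaying r, d, or another metric is purely a presentation choice.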

  5. We want to thank two anonymous reviewers for suggesting that we address this important issue.

References

  • Aguinis, H., Gottfredson, R. K., & Joo, H. (2013). Best-practice recommendations for defining, identifying, and handling outliers. Organizational Research Methods, 16, 270–301. https://doi.org/10.1177/1094428112470848

  • American Psychological Association. (2022). Journal coverage information for publishers. Retrieved August 4, 2022, from https://www.apa.org/pubs/databases/psycinfo/publishers/journals

  • Appelbaum, M., Cooper, H., Kline, R. B., Mayo-Wilson, E., Nezu, A. M., & Rao, S. M. (2018). Journal article reporting standards for quantitative research in psychology: The APA Publications and Communications Board task force report. American Psychologist, 73, 3–25. https://doi.org/10.1037/amp0000191

  • Bachrach, D. G., Lewis, K., Kim, Y., Patel, P. C., Campion, M. C., & Thatcher, S. (2019). Transactive memory systems in context: A meta-analytic examination of contextual factors in transactive memory systems development and team performance. Journal of Applied Psychology, 104, 464.

  • Banks, G. C., Kepes, S., & McDaniel, M. A. (2012). Publication bias: A call for improved meta-analytic practice in the organizational sciences. International Journal of Selection and Assessment, 20, 182–196. https://doi.org/10.1111/j.1468-2389.2012.00591.x

  • Banks, G. C., Kepes, S., & McDaniel, M. A. (2015). Publication bias: Understanding the myths concerning threats to the advancement of science. In C. E. Lance & R. J. Vandenberg (Eds.), More statistical and methodological myths and urban legends (pp. 36–64). Routledge.

  • Banks, G. C., Woznyj, H. M., Kepes, S., Batchelor, J. H., & McDaniel, M. A. (2018). A meta-analytic review of tipping compensation practices: An agency theory perspective. Personnel Psychology, 71, 457–478. https://doi.org/10.1111/peps.12261

  • Banks, G. C., Field, J. G., Oswald, F. L., O’Boyle, E. H., Landis, R. S., Rupp, D. E., & Rogelberg, S. G. (2019). Answers to 18 questions about open science practices. Journal of Business and Psychology, 34, 257–270. https://doi.org/10.1007/s10869-018-9547-8

  • Becker, B. J. (2005). The failsafe N or file-drawer number. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta analysis: Prevention, assessment, and adjustments (pp. 111–126). Wiley.

  • Begg, C. B., & Mazumdar, M. (1994). Operating characteristics of a rank correlation test for publication bias. Biometrics, 50, 1088–1101. https://doi.org/10.2307/2533446

  • Benjamin, A. J., Kepes, S., & Bushman, B. J. (2018). Effects of weapons on aggressive thoughts, angry feelings, hostile appraisals, and aggressive behavior: A meta-analytic review of the weapons effect literature. Personality and Social Psychology Review, 22, 347–377. https://doi.org/10.1177/1088868317725419

  • Borenstein, M. (2019). Heterogeneity in meta-analysis. In H. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (3rd ed., pp. 453–470). Russel Sage Foundation.

  • Borenstein, M., Hedges, L. V., Higgins, J. P., & Rothstein, H. R. (2009). Introduction to meta-analysis. Wiley.

  • Bosco, F. A., Aguinis, H., Singh, K., Field, J. G., & Pierce, C. A. (2015). Correlational effect size benchmarks. Journal of Applied Psychology, 100, 431–449. https://doi.org/10.1037/a0038047

  • Boudreau, J. W. (1991). Utility analysis for decisions in human resource management. In Handbook of industrial and organizational psychology (2nd ed., Vol. 2, pp. 621–745). Consulting Psychologists Press.

  • Burnay, J., Kepes, S., & Bushman, B. J. (2022). Effects of violent and nonviolent sexualized media on aggression-related thoughts, feelings, attitudes, and behaviors: A meta-analytic review. Aggressive Behavior, 48, 111–136. https://doi.org/10.1002/ab.21998

  • Carlson, K. D., & Ji, F. X. (2011). Citing and building on meta-analytic findings: A review and recommendations. Organizational Research Methods, 14, 696–717. https://doi.org/10.1177/1094428110384272

  • Carter, E. C., Schönbrodt, F. D., Gervais, W. M., & Hilgard, J. (2019). Correcting for bias in psychology: A comparison of meta-analytic methods. Advances in Methods and Practices in Psychological Science, 2, 115–144. https://doi.org/10.1177/2515245919847196

  • Cascio, W. F. (2000). Costing human resources: The financial impact of behavior in organizations. South-Western.

  • Cleveland, W. S., & McGill, R. (1985). Graphical perception and graphical methods for analyzing scientific data. Science, 229, 828–833. https://doi.org/10.1126/science.229.4716.828

  • Coburn, K. M., & Vevea, J. L. (2015). Publication bias as a function of study characteristics. Psychological Methods, 20, 310–330. https://doi.org/10.1037/met0000046

  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Lawrence Erlbaum.

  • Copas, J., & Shi, J. Q. (2000). Meta-analysis, funnel plots and sensitivity analysis. Biostatistics, 1, 247–262. https://doi.org/10.1093/biostatistics/1.3.247

  • Cortina, J. M., Koehler, T., Keeler, K. R., & Nielsen, B. B. (2019). Restricted variance interaction effects: What they are and why they are your friends. Journal of Management, 45, 2779–2806. https://doi.org/10.1177/0149206318770735

  • De Angelis, C., Drazen, J. M., Frizelle, F. A., Haug, C., Hoey, J., Horton, R., … & Weyden, M. B. V. D. (2004). Clinical trial registration: A statement from the International Committee of Medical Journal Editors. New England Journal of Medicine, 351, 1250–1251. https://doi.org/10.1056/NEJMe048225

  • Dear, K. B. G., & Begg, C. B. (1992). An approach for assessing publication bias prior to performing a meta-analysis. Statistical Science, 7, 237–245. https://doi.org/10.1214/ss/1177011363

  • Dechartres, A., Ravaud, P., Atal, I., Riveros, C., & Boutron, I. (2016). Association between trial registration and treatment effect estimates: A meta-epidemiological study. BMC Medicine, 14, 100. https://doi.org/10.1186/s12916-016-0639-x

  • Derzon, J. H., & Alford, A. A. (2013). Forest plots in Excel: Moving beyond a clump of trees to a forest of visual information. Practical Assessment, Research, and Evaluation, 18, 1–9. https://doi.org/10.7275/96vm-5c74

  • DeSimone, J. A., Köhler, T., & Schoen, J. L. (2019). If it were only that easy: The use of meta-analytic research by organizational scholars. Organizational Research Methods, 22, 867–891. https://doi.org/10.1177/1094428118756743

  • Dickersin, K., & Rennie, D. (2012). The evolution of trial registries and their use to assess the clinical trial enterprise. Journal of the American Medical Association, 307, 1861–1864. https://doi.org/10.1001/jama.2012.4230

  • Duval, S. J. (2005). The “trim and fill” method. In H. R. Rothstein, A. Sutton, & M. Borenstein (Eds.), Publication bias in meta analysis: Prevention, assessment, and adjustments (pp. 127–144). Wiley.

  • Duval, S. J., & Tweedie, R. L. (2000). Trim and fill: A simple funnel-plot based method of testing and adjusting for publication bias in meta-analysis. Biometrics, 56, 455–463.

  • Egger, M., Smith, G. D., Schneider, M., & Minder, C. (1997). Bias in meta-analysis detected by a simple, graphical test. British Medical Journal, 315, 629–634. https://doi.org/10.1136/bmj.315.7109.629

  • Egger, M., Smith, G. D., & Altman, D. (2001). Systematic reviews in health care: Meta-analysis in context. BMJ Books.

  • Ellison, A. M. (2001). Exploratory data analysis and graphic display. In S. M. Scheiner & J. Gurevitch (Eds.), Design and analysis of ecological experiments. Oxford University Press.

  • Field, A. P., & Gillett, R. (2010). How to do a meta-analysis. British Journal of Mathematical and Statistical Psychology, 63, 665–694. https://doi.org/10.1348/000711010X502733

  • Field, J. G., Bosco, F. A., & Kepes, S. (2021). How robust is our cumulative knowledge on turnover? Journal of Business and Psychology, 36, 349–365. https://doi.org/10.1007/s10869-020-09687-3

  • Fletcher, J. (2007). What is heterogeneity and is it important? British Medical Journal, 334, 94–96. https://doi.org/10.1136/bmj.39057.406644.68

  • Giolla, E. M., Karlsson, S., Neequaye, D. A., & Bergquist, M. (2022). Evaluating the replicability of social priming studies. Gothenburg, Sweden: University of Gothenburg. https://doi.org/10.31234/osf.io/dwg9v

  • Giustini, D. (2019). Retrieving grey literature, information, and data in the digital age. In H. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (pp. 101–126). Russell Sage Foundation.

  • Greenwald, A. G. (1975). Consequences of prejudice against the null hypothesis. Psychological Bulletin, 82, 1–20. https://doi.org/10.1037/h0076157

  • Hancock, J. I., Allen, D. G., Bosco, F. A., McDaniel, K. R., & Pierce, C. A. (2013). Meta-analytic review of employee turnover as a predictor of firm performance. Journal of Management, 39, 573–603. https://doi.org/10.1177/0149206311424943

  • Harrison, J. S., Banks, G. C., Pollack, J. M., O’Boyle, E. H., & Short, J. (2017). Publication bias in strategic management research. Journal of Management, 43, 400–425. https://doi.org/10.1177/0149206314535438

  • Hedges, L. V., & Vevea, J. L. (1996). Estimating effect size under publication bias: Small sample properties and robustness of a random effects selection model. Journal of Educational and Behavioral Statistics, 21, 299–332. https://doi.org/10.2307/1165338

  • Hedges, L. V., & Vevea, J. L. (2005). Selection methods approaches. In H. R. Rothstein, A. Sutton, & M. Borenstein (Eds.), Publication bias in meta analysis: Prevention, assessment, and adjustments (pp. 145–174). Wiley.

  • Hedges, L. V. (1992). Modeling publication selection effects in meta-analysis. Statistical Science, 246–255.

  • Henrich, J. (2020). The WEIRDest people in the world: How the West became psychologically peculiar and particularly prosperous. Farrar, Straus and Giroux.

  • Higgins, J. P. T., Thompson, S. G., Deeks, J. J., & Altman, D. G. (2003). Measuring inconsistency in meta-analyses. British Medical Journal, 327, 557–560. https://doi.org/10.1136/bmj.327.7414.557

  • Higgins, J. P. T., Altman, D. G., Gøtzsche, P. C., Jüni, P., Moher, D., Oxman, A. D., … & Sterne, J. A. C. (2011). The Cochrane Collaboration’s tool for assessing risk of bias in randomised trials. BMJ, 343, d5928. https://doi.org/10.1136/bmj.d5928

  • Higgins, J. P., & Green, S. (Eds.). (2009). Cochrane handbook for systematic reviews of interventions; Version 5.0.2 [updated September 2009]: The Cochrane Collaboration. Available from www.cochrane-handbook.org

  • Higgins, J. P., Thomas, J., Chandler, J., Cumpston, M., Li, T., Page, M. J., & Welch, V. A. (Eds.). (2021). Cochrane handbook for systematic reviews of interventions; Version 6.2: The Cochrane Collaboration. Available from https://training.cochrane.org/handbook

  • Hofstede, G. (2001). Culture’s consequences: Comparing values, behaviors, institutions and organizations across nations. Sage.

  • Hopewell, S., Clarke, M., & Mallett, S. (2005). Grey literature and systematic reviews. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta-analysis (pp. 49–72). Wiley.

  • Huffcutt, A. I., & Arthur, W. (1995). Development of a new outlier statistic for meta-analytic data. Journal of Applied Psychology, 80, 327–334. https://doi.org/10.1037/0021-9010.80.2.327

  • Hunter, J. E., & Schmidt, F. L. (1990). Methods of meta-analysis: Correcting error and bias in research findings. Sage.

  • Hurtz, G. M., & Donovan, J. J. (2000). Personality and job performance: The Big Five revisited. Journal of Applied Psychology, 85, 869–879. https://doi.org/10.1037/0021-9010.85.6.869

  • Ioannidis, J. P., & Trikalinos, T. A. (2007). An exploratory test for an excess of significant findings. Clinical Trials, 4, 245–253. https://doi.org/10.1177/1740774507079441

  • Jick, T. D. (1979). Mixing qualitative and quantitative methods: Triangulation in action. Administrative Science Quarterly, 24, 602–611. https://doi.org/10.2307/2392366

  • Kepes, S., & McDaniel, M. A. (2013). How trustworthy is the scientific literature in industrial and organizational psychology. Industrial and Organizational Psychology: Perspectives on Science and Practice, 6, 252–268. https://doi.org/10.1111/iops.12045

  • Kepes, S., & McDaniel, M. A. (2015). The validity of conscientiousness is overestimated in the prediction of job performance. PLoS ONE, 10, e0141468. https://doi.org/10.1371/journal.pone.0141468

  • Kepes, S., & Thomas, M. A. (2018). Assessing the robustness of meta-analytic results in information systems: Publication bias and outliers. European Journal of Information Systems, 27, 90–123. https://doi.org/10.1080/0960085X.2017.1390188

  • Kepes, S., Banks, G. C., McDaniel, M. A., & Whetzel, D. L. (2012). Publication bias in the organizational sciences. Organizational Research Methods, 15, 624–662. https://doi.org/10.1177/1094428112452760

  • Kepes, S., McDaniel, M. A., Brannick, M. T., & Banks, G. C. (2013). Meta-analytic reviews in the organizational sciences: Two meta-analytic schools on the way to MARS (the Meta-analytic Reporting Standards). Journal of Business and Psychology, 28, 123–143. https://doi.org/10.1007/s10869-013-9300-2

  • Kepes, S., Banks, G. C., & Oh, I.-S. (2014a). Avoiding bias in publication bias research: The value of “null” findings. Journal of Business and Psychology, 29, 183–203. https://doi.org/10.1007/s10869-012-9279-0

  • Kepes, S., Bennett, A. A., & McDaniel, M. A. (2014b). Evidence-based management and the trustworthiness of our cumulative scientific knowledge: Implications for teaching, research, and practice. Academy of Management Learning & Education, 13, 446–466. https://doi.org/10.5465/amle.2013.0193

  • Kepes, S., Bushman, B. J., & Anderson, C. A. (2017). Violent video game effects remain a societal concern: Comment on Hilgard, Engelhardt, and Rouder (2017). Psychological Bulletin, 143, 775–782. https://doi.org/10.1037/bul0000112

  • Kepes, S., List, S. K., & McDaniel, M. A. (2018). Enough talk, it’s time to transform: A call for editorial leadership for a robust science. Industrial and Organizational Psychology, 11, 43–48. https://doi.org/10.1017/iop.2017.83

  • Kepes, S., Keener, S. K., McDaniel, M. A., & Hartman, N. S. (2022). Questionable research practices among researchers in the most research-productive management programs. Journal of Organizational Behavior. https://doi.org/10.1002/job.2623

  • Kisamore, J. L., & Brannick, M. T. (2008). An illustration of the consequences of meta-analysis model choice. Organizational Research Methods, 11, 35–53. https://doi.org/10.1177/1094428106287393

  • Koslowsky, M., & Sagie, A. (1993). On the efficacy of credibility intervals as indicators of moderator effects in meta-analytic research. Journal of Organizational Behavior, 14, 695–699. https://doi.org/10.1002/job.4030140708

  • Kulinskaya, E., & Koricheva, J. (2010). Use of quality control charts for detection of outliers and temporal trends in cumulative meta-analysis. Research Synthesis Methods, 1, 297–307. https://doi.org/10.1002/jrsm.29

  • Laine, C., Horton, R., DeAngelis, C. D., Drazen, J. M., Frizelle, F. A., Godlee, F., … & Verheugt, F. W. A. (2007). Clinical trial registration—Looking back and moving ahead. New England Journal of Medicine, 356, 2734–2736. https://doi.org/10.1056/NEJMe078110

  • Latham, G. P., & Whyte, G. (1994). The futility of utility analysis. Personnel Psychology, 47, 31.

  • Lau, J., Antman, E. M., Jimenez-Silva, J., Kupelnick, B., Mosteller, F., & Chalmers, T. C. (1992). Cumulative meta-analysis of therapeutic trials for myocardial infarction. New England Journal of Medicine, 327, 248–254. https://doi.org/10.1056/nejm199207233270406

  • Lau, J., Schmid, C. H., & Chalmers, T. C. (1995). Cumulative meta-analysis of clinical trials builds evidence for exemplary medical care. Journal of Clinical Epidemiology, 48, 45–57. https://doi.org/10.1016/0895-4356(94)00106-Z

  • Li, G., Zeng, J., Tian, J., Levine, M. A. H., & Thabane, L. (2020). Multiple uses of forest plots in presenting analysis results in health research: A tutorial. Journal of Clinical Epidemiology, 117, 89–98. https://doi.org/10.1016/j.jclinepi.2019.09.021

  • Light, R. J., & Pillemer, D. B. (1984). Summing up: The science of reviewing research. Harvard University Press.

  • Light, R. J., Singer, J. D., & Willett, J. B. (1994). The visual presentation and interpretation of meta-analyses. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 439–453). Russell Sage Foundation.

  • Lindsley, K., Fusco, N., Li, T., Scholten, R., & Hooft, L. (2022). Clinical trial registration was associated with lower risk of bias compared with non-registered trials among trials included in systematic reviews. Journal of Clinical Epidemiology, 145, 164–173. https://doi.org/10.1016/j.jclinepi.2022.01.012

  • List, S. K., Kepes, S., McDaniel, M. A., & MacDaniel, X. (2018a). Assessing the trustworthiness of our cumulative knowledge in learning, behavior, and performance. Paper presented at the annual meeting of the Academy of Management, Chicago, IL.

  • List, S. K., MacDaniel, X., Kepes, S., & McDaniel, M. A. (2018b). Assessing the trustworthiness of our cumulative knowledge in psychology. Paper presented at the annual meeting of the Society for Industrial and Organizational Psychology, Chicago, IL.

  • Lortie, C. J., Lau, J., & Lajeunesse, M. J. (2013). Graphical presentation of results. In J. Koricheva, J. Gurevitch, & K. Mengersen (Eds.), Handbook of meta-analysis in ecology and evolution (pp. 339–347). Princeton University Press.

  • Mackey, J. D., McAllister, C. P., Maher, L. P., & Wang, G. (2019). Leaders and followers behaving badly: A meta-analytic examination of curvilinear relationships between destructive leadership and followers’ workplace behaviors. Personnel Psychology, 72, 3–47.

  • Mavridis, D., Sutton, A., Cipriani, A., & Salanti, G. (2013). A fully Bayesian application of the Copas selection model for publication bias extended to network meta-analysis. Statistics in Medicine, 32, 51–66. https://doi.org/10.1002/sim.5494

  • McGaw, B., & Glass, G. V. (1980). Choice of the metric for effect size in meta-analysis. American Educational Research Journal, 17, 325–337. https://doi.org/10.3102/00028312017003325

  • McShane, B. B., Böckenholt, U., & Hansen, K. T. (2016). Adjusting for publication bias in meta-analysis: An evaluation of selection methods and some cautionary notes. Perspectives on Psychological Science, 11, 730–749. https://doi.org/10.1177/1745691616662243

  • Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & the PRISMA Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Annals of Internal Medicine, 151, 264–269. https://doi.org/10.7326/0003-4819-151-4-200908180-00135

  • Moreno, S. G., Sutton, A., Ades, A. E., Stanley, T. D., Abrams, K. R., Peters, J. L., & Cooper, N. J. (2009). Assessment of regression-based methods to adjust for publication bias through a comprehensive simulation study. BMC Medical Research Methodology, 9.

  • O’Boyle, E. H., Banks, G. C., & Gonzalez-Mulé, E. (2017). The chrysalis effect: How ugly initial results metamorphosize into beautiful articles. Journal of Management, 43, 376–399. https://doi.org/10.1177/0149206314527133

  • O’Boyle, E., Banks, G. C., Carter, K., Walter, S., & Yuan, Z. (2019). A 20-year review of outcome reporting bias in moderated multiple regression. Journal of Business and Psychology, 34, 19–37. https://doi.org/10.1007/s10869-018-9539-8

  • Orlitzky, M. (2012). How can significance tests be deinstitutionalized? Organizational Research Methods, 15, 199–228. https://doi.org/10.1177/1094428111428356

  • Orwin, R. G. (1983). A fail-safe N for effect size in meta-analysis. Journal of Educational Statistics, 8, 157–159. https://doi.org/10.3102/10769986008002157

  • Papageorgiou, S. N., Xavier, G. M., Cobourne, M. T., & Eliades, T. (2018). Registered trials report less beneficial treatment effects than unregistered ones: A meta-epidemiological study in orthodontics. Journal of Clinical Epidemiology, 100, 44–52. https://doi.org/10.1016/j.jclinepi.2018.04.017

  • Peters, J. L., Sutton, A. J., Jones, D. R., Abrams, K. R., & Rushton, L. (2006). Comparison of two methods to detect publication bias in meta-analysis. Journal of the American Medical Association, 295, 676–680. https://doi.org/10.1001/jama.295.6.676

  • Peters, J. L., Sutton, A. J., Jones, D. R., Abrams, K. R., & Rushton, L. (2008). Contour-enhanced meta-analysis funnel plots help distinguish publication bias from other causes of asymmetry. Journal of Clinical Epidemiology, 61, 991–996. https://doi.org/10.1016/j.jclinepi.2007.11.010

  • Renkewitz, F., & Keiner, M. (2019). How to detect publication bias in psychological research: A comparative evaluation of six statistical methods. Zeitschrift Für Psychologie, 227, 261–279. https://doi.org/10.1027/2151-2604/a000386

  • Richard, F. D., Bond, C. F., Jr., & Stokes-Zoota, J. J. (2003). One hundred years of social psychology quantitatively described. Review of General Psychology, 7, 331–363. https://doi.org/10.1037/1089-2680.7.4.331

  • Rosenthal, R. (1979). The file drawer problem and tolerance for null results. Psychological Bulletin, 86, 638–641. https://doi.org/10.1037/0033-2909.86.3.638

  • Rothstein, H. (2012). Accessing relevant literature. In H. M. Cooper (Ed.), APA handbook of research methods in psychology: Vol. 1. Foundations, planning, measures, and psychometrics (pp. 133–144). American Psychological Association.

  • Rothstein, H. R., & Hopewell, S. (2009). Grey literature. In H. M. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (2nd ed.). Russell Sage Foundation.

  • Rothstein, H. R., Sutton, A. J., & Borenstein, M. (2005a). Publication bias in meta-analyses. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta analysis: Prevention, assessment, and adjustments (pp. 1–7). Wiley.

  • Rothstein, H. R., Sutton, A. J., & Borenstein, M. (2005b). Publication bias in meta-analysis: Prevention, assessment, and adjustments. Wiley.

  • Rücker, G., Carpenter, J. R., & Schwarzer, G. (2011). Detecting and adjusting for small-study effects in meta-analysis. Biometrical Journal, 53, 351–368. https://doi.org/10.1002/bimj.201000151

  • Schmidt, F. L., & Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124, 262–274. https://doi.org/10.1037/0033-2909.124.2.262

  • Schöpfel, J., & Farace, D. J. (2015). Grey literature. In M. J. Bates & M. N. Maack (Eds.), Encyclopedia of library and information sciences (3rd ed.). CRC Press.

  • Schwarzer, G. (2022). meta: General package for meta-analysis (version 5.2–0).

  • Schwarzer, G., Carpenter, J. R., & Rücker, G. (2022). metasens: Statistical methods for sensitivity analysis in meta-analysis (version 1.0–1).

  • Shea, B. J., Reeves, B. C., Wells, G., Thuku, M., Hamel, C., Moran, J., … & Henry, D. A. (2017). AMSTAR 2: A critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. British Medical Journal, 358, j4008. https://doi.org/10.1136/bmj.j4008

  • Shewach, O. R., Sackett, P. R., & Quint, S. (2019). Stereotype threat effects in settings with features likely versus unlikely in operational test settings: A meta-analysis. Journal of Applied Psychology, 104, 1514.

  • Siegel, M., Eder, J. S. N., Wicherts, J. M., & Pietschnig, J. (2022). Times are changing, bias isn’t: A meta-meta-analysis on publication bias detection practices, prevalence rates, and predictors in industrial/organizational psychology. Journal of Applied Psychology. https://doi.org/10.1037/apl0000991

  • Simonsohn, U., Nelson, L. D., & Simmons, J. P. (2014a). P-curve and effect size: Correcting for publication bias using only significant results. Perspectives on Psychological Science, 9, 666–681. https://doi.org/10.1177/1745691614553988

  • Simonsohn, U., Nelson, L. D., & Simmons, J. P. (2014b). P-curve: A key to the file-drawer. Journal of Experimental Psychology: General, 143, 534–547. https://doi.org/10.1037/a0033242

  • Simonsohn, U., Simmons, J. P., & Nelson, L. D. (2015). Better p-curves: Making p-curve analysis more robust to errors, fraud, and ambitious p-hacking, a reply to Ulrich and Miller (2015). Journal of Experimental Psychology: General, 144, 1146–1152. https://doi.org/10.1037/xge0000104

  • Smith, P. C., Kendall, L., & Hulin, C. L. (1969). The measurement of satisfaction in work and retirement: A strategy for the study of attitudes. Rand McNally.

  • Song, F., Parekh, S., Hooper, L., Loke, Y., Ryder, J., Sutton, A. J., … & Harvey, I. (2010). Dissemination and publication of research findings: An updated review of related biases. Health Technology Assessment, 14, 1–220. https://doi.org/10.3310/hta14080

  • Song, F., Hooper, L., & Loke, Y. (2013). Publication bias: What is it? How do we measure it? How do we avoid it? Open Access Journal of Clinical Trials, 5, 71–81. https://doi.org/10.2147/OAJCT.S34419

  • Spector, P. E. (1985). Measurement of human service staff satisfaction: Development of the Job Satisfaction Survey. American Journal of Community Psychology, 13, 693–713. https://doi.org/10.1007/BF00929796

  • Stanley, T. D., & Doucouliagos, H. (2012). Meta-regression analysis in economics and business. Routledge.

  • Stanley, T. D., & Doucouliagos, H. (2014). Meta-regression approximations to reduce publication selection bias. Research Synthesis Methods, 5, 60–78. https://doi.org/10.1002/jrsm.1095

  • Stanley, T. D., & Doucouliagos, H. (2017). Neither fixed nor random: Weighted least squares meta-regression. Research Synthesis Methods, 8, 19–42. https://doi.org/10.1002/jrsm.1211

  • Sterne, J. A., & Egger, M. (2005). Regression methods to detect publication bias and other bias in meta-analysis. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta analysis: Prevention, assessment, and adjustments (pp. 99–110). Wiley.

  • Sterne, J. A. C., Gavaghan, D., & Egger, M. (2000). Publication and related bias in meta-analysis: Power of statistical tests and prevalence in the literature. Journal of Clinical Epidemiology, 53, 1119–1129. https://doi.org/10.1016/S0895-4356(00)00242-0

  • Sterne, J. A., Gavaghan, D., & Egger, M. (2005). The funnel plot. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta analysis: Prevention, assessment, and adjustments (pp. 75–98). Wiley.

  • Sterne, J. A. C., Sutton, A. J., Ioannidis, J. P. A., Terrin, N., Jones, D. R., Lau, J., … & Higgins, J. P. T. (2011). Recommendations for examining and interpreting funnel plot asymmetry in meta-analyses of randomised controlled trials. British Medical Journal, 343, 302–307. https://doi.org/10.1136/bmj.d4002

  • Tay, L., Ng, V., Malik, A., Zhang, J., Chae, J., Ebert, D. S., … & Kern, M. (2018). Big data visualizations in organizational science. Organizational Research Methods, 21, 660–688. https://doi.org/10.1177/1094428117720014

  • Terrin, N., Schmid, C. H., Lau, J., & Olkin, I. (2003). Adjusting for publication bias in the presence of heterogeneity. Statistics in Medicine, 22, 2113–2126. https://doi.org/10.1002/sim.1461

  • Trikalinos, T. A., & Ioannidis, J. P. A. (2005). Assessing the evolution of effect sizes over time. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta analysis: Prevention, assessment and adjustments (pp. 241–259). Wiley.

  • Tufte, E. R. (2001). The visual display of quantitative information (2nd ed.). Graphics Press.

  • van Aert, R. C. M., Wicherts, J. M., & van Assen, M. A. L. M. (2016). Conducting meta-analyses based on p values: Reservations and recommendations for applying p-uniform and p-curve. Perspectives on Psychological Science, 11, 713–729. https://doi.org/10.1177/1745691616650874

  • van Assen, M. A. L. M., van Aert, R. C. M., & Wicherts, J. M. (2015). Meta-analysis using effect size distributions of only statistically significant studies. Psychological Methods, 20, 293–309. https://doi.org/10.1037/met0000025

  • van Aert, R. C. M., & van Assen, M. A. L. M. (2018). Correcting for publication bias in a meta-analysis with the p-uniform* method. MetaArXiv. https://doi.org/10.31222/osf.io/zqjr9

  • van Aert, R. C. M., & van Assen, M. A. L. M. (2019). Correcting for publication bias in a meta-analysis with the p-uniform* method. Tilburg University.

  • van Aert, R. C. M. (2022). puniform: Meta-analysis methods correcting for publication bias (version 0.2.5).

  • Vevea, J. L., & Hedges, L. V. (1995). A general linear model for estimating effect size in the presence of publication bias. Psychometrika, 60, 419–435. https://doi.org/10.1007/BF02294384

  • Vevea, J. L., & Woods, C. M. (2005). Publication bias in research synthesis: Sensitivity analysis using a priori weight functions. Psychological Methods, 10, 428–443. https://doi.org/10.1037/1082-989X.10.4.428

  • Vevea, J. L., Clements, N. C., & Hedges, L. V. (1993). Assessing the effects of selection bias on validity data for the General Aptitude Test Battery. Journal of Applied Psychology, 78, 981–987. https://doi.org/10.1037/0021-9010.78.6.981

  • Vevea, J. L., Coburn, K., & Sutton, A. (2019). Publication bias. In H. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (pp. 383–429). Russell Sage Foundation.

  • Viechtbauer, W. (2010). Conducting meta-analyses in R with the metafor package. Journal of Statistical Software, 36, 1–48.

  • Viechtbauer, W., & Cheung, M. W. L. (2010). Outlier and influence diagnostics for meta-analysis. Research Synthesis Methods, 1, 112–125. https://doi.org/10.1002/jrsm.11

  • Viechtbauer, W. (2022). metafor: Meta-analysis package for R (version 3.4-0).

  • Weinhandl, E. D., & Duval, S. (2012). Generalization of trim and fill for application in meta-regression. Research Synthesis Methods, 3, 51–67. https://doi.org/10.1002/jrsm.1042

  • Whitener, E. M. (1990). Confusion of confidence intervals and credibility intervals in meta-analysis. Journal of Applied Psychology, 75, 315–321. https://doi.org/10.1037/0021-9010.75.3.315

  • Whiting, P., Savović, J., Higgins, J. P. T., Caldwell, D. M., Reeves, B. C., Shea, B., … & Churchill, R. (2016). ROBIS: A new tool to assess risk of bias in systematic reviews was developed. Journal of Clinical Epidemiology, 69, 225–234. https://doi.org/10.1016/j.jclinepi.2015.06.005

Author information

Corresponding author

Correspondence to Sven Kepes.

Ethics declarations

Conflict of Interest

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Kepes, S., Wang, W. & Cortina, J.M. Assessing Publication Bias: a 7-Step User’s Guide with Best-Practice Recommendations. J Bus Psychol 38, 957–982 (2023). https://doi.org/10.1007/s10869-022-09840-0
