Controlling for Publication Bias: Challenges & Future Directions

Panel Discussion at the 2023 ESMARConf

Abstract

The goal of systematic reviews and meta-analyses is to provide a comprehensive, unbiased synthesis of the available evidence in a research field. This aim is seriously threatened if there is reason to believe that some results are systematically missing, underrepresented, or distorted in the published literature. Adequately controlling for such publication bias in meta-analyses remains challenging. Various methods are available, which differ in their assumptions about why publication bias arises, as well as how it manifests itself. This panel discussion brings together highly experienced field experts to highlight current state-of-the-art methods to control for publication bias, and their implementations in R. We also want to shed light on how evidence synthesists may navigate the great variety of approaches and implementations, and whether some may be preferable to others. Finally, we aim to explore open research questions and future directions in the development of methods to adjust for publication bias.

Date
January 20, 2023, 13:00



Interview with Maya Mathur (Stanford University)


How is publication bias typically conceptualized? Do some methods differ in how they assume publication bias manifests itself?

What methods to adjust for publication bias are available in R? Which approaches can you recommend to (novice and/or experienced) meta-analysts?

Which of these methods typically performs best, and which might be best suited to which specific context?

What limitations do you see with current approaches? Are there open research questions?

How severe do you believe publication bias to be? Do you believe there are phenomena in science where effect sizes are “simply” inflated?


Further Reading


Mathur, M. B. (2022, June 1). Sensitivity analysis for $p$-hacking in meta-analyses. https://doi.org/10.31219/osf.io/ezjsx

Mathur, M. B. (2022, August 22). Sensitivity analysis for the interactive effects of internal bias and publication bias in meta-analyses. https://doi.org/10.31219/osf.io/ezjsx

Maier, M., VanderWeele, T. J., & Mathur, M. B. (2022). Using selection models to assess sensitivity to publication bias: A tutorial and call for more routine use. Campbell Systematic Reviews. https://doi.org/10.1002/cl2.1256

Mathur, M. B., & VanderWeele, T. J. (2020). Sensitivity analysis for publication bias in meta-analyses. Journal of the Royal Statistical Society: Series C (Applied Statistics), 69(5), 1091–1119. https://doi.org/10.1111/rssc.12440

Bartoš, F., Maier, M., Wagenmakers, E.-J., Doucouliagos, H., & Stanley, T. D. (2021). Robust Bayesian meta-analysis: Model-averaging across complementary publication bias adjustment methods. Research Synthesis Methods. https://doi.org/10.1002/jrsm.1594

Bartoš, F., & Schimmack, U. (2022). $Z$-curve 2.0: Estimating replication rates and discovery rates. Meta-Psychology, 6. https://doi.org/10.15626/MP.2021.2720

Bartoš, F., Gronau, Q. F., Timmers, B., Otte, W. M., Ly, A., & Wagenmakers, E.-J. (2021). Bayesian model-averaged meta-analysis in medicine. Statistics in Medicine. https://doi.org/10.1002/sim.9170

Bartoš, F., Maier, M., Wagenmakers, E.-J., Nippold, F., Doucouliagos, H., Ioannidis, J. P. A., Otte, W. M., Sladekova, M., Fanelli, D., & Stanley, T. D. (2022). Footprint of publication selection bias on meta-analyses in medicine, economics, and psychology. https://doi.org/10.48550/arXiv.2208.12334

Page, M. J., Sterne, J. A. C., Boutron, I., Hróbjartsson, A., …, Higgins, J. P. T. (2020). Risk Of Bias due to Missing Evidence (ROB-ME): A new tool for assessing risk of non-reporting biases in evidence syntheses (Version 24 October 2020). https://sites.google.com/site/riskofbiastool//welcome/rob-me-tool

Page, M. J., Sterne, J. A. C., Higgins, J. P. T., & Egger, M. (2021). Investigating and dealing with publication bias and other reporting biases in meta-analyses of health research: A review. Research Synthesis Methods, 12(2), 248–259. https://doi.org/10.1002/jrsm.1468

van Aert, R. C. M., & van Assen, M. A. L. M. (2022). Correcting for publication bias in a meta-analysis with the $p$-uniform* method. https://doi.org/10.31222/osf.io/zqjr9

van Aert, R. C. M., Wicherts, J. M., & van Assen, M. A. L. M. (2016). Conducting meta-analyses on $p$-values: Reservations and recommendations for applying $p$-uniform and $p$-curve. Perspectives on Psychological Science, 11(5), 713-729. https://doi.org/10.1177/1745691616650874


Selected Functions & Packages in R


Braginsky M, Mathur M (2023). phacking: Sensitivity Analysis for p-Hacking in Meta-Analyses. R package version 0.1.0, https://CRAN.R-project.org/package=phacking.
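
As a quick orientation, a minimal usage sketch of the package's phacking_meta() function, which fits the right-truncated meta-analysis (RTMA) model underlying this sensitivity analysis. The data frame dat is a hypothetical placeholder, and element names may differ across versions:

    library(phacking)
    # dat: hypothetical data frame with effect estimates (yi)
    # and their sampling variances (vi)
    fit <- phacking_meta(yi = dat$yi, vi = dat$vi,
                         favor_positive = TRUE)  # direction favored by selection
    fit$stats  # posterior summaries for the mean (mu) and heterogeneity (tau)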

Braginsky M, Mathur M (2023). multibiasmeta: Sensitivity Analysis for Multiple Biases in Meta-Analyses. R package version 0.1.0, https://CRAN.R-project.org/package=multibiasmeta.
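
A hedged sketch of a joint sensitivity analysis with multibias_meta(); the selection ratio and internal-bias values below are illustrative assumptions, not recommendations, and dat is again a placeholder:

    library(multibiasmeta)
    # Assume affirmative results are 4x more likely to be published, and that
    # studies are additionally shifted by an internal bias of log(1.5)
    fit <- multibias_meta(yi = dat$yi, vi = dat$vi,
                          selection_ratio = 4,
                          bias_affirmative = log(1.5),
                          bias_nonaffirmative = 0)
    fit$stats  # bias-corrected pooled estimate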

Braginsky M, Mathur M, VanderWeele T (2023). PublicationBias: Sensitivity Analysis for Publication Bias in Meta-Analyses. R package version 2.3.0, https://CRAN.R-project.org/package=PublicationBias.
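
A minimal sketch, assuming the version 2.x interface (pubbias_meta() and pubbias_svalue()); the data frame dat and the selection ratio of 4 are placeholders:

    library(PublicationBias)
    # Pooled estimate corrected under a hypothesized selection process in which
    # affirmative (significant, positive) results are 4x more likely published
    pubbias_meta(yi = dat$yi, vi = dat$vi, selection_ratio = 4,
                 model_type = "robust", favor_positive = TRUE)
    # S-value: how severe selection would have to be to shift
    # the pooled estimate to the null (q = 0)
    pubbias_svalue(yi = dat$yi, vi = dat$vi, q = 0, model_type = "robust")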

Bartoš F, Maier M (2020). “RoBMA: An R Package for Robust Bayesian Meta-Analyses.” R package version 2.3.1, https://CRAN.R-project.org/package=RoBMA.
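
A minimal sketch of a robust Bayesian meta-analysis with default priors (RoBMA requires a working JAGS installation; dat is a hypothetical data frame of standardized mean differences and standard errors):

    library(RoBMA)
    # Model-averaged ensemble over effect, heterogeneity, and publication
    # bias components (selection models and PET-PEESE)
    fit <- RoBMA(d = dat$d, se = dat$se, seed = 1)
    summary(fit)  # posterior estimates and inclusion Bayes factors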

van Aert RC (2022). puniform: Meta-Analysis Methods Correcting for Publication Bias. R package version 0.2.5, https://CRAN.R-project.org/package=puniform.
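
A minimal sketch of $p$-uniform* and the original $p$-uniform (side indicates the expected direction of the effect; dat is a placeholder):

    library(puniform)
    # p-uniform*: uses significant and non-significant effects
    puni_star(yi = dat$yi, vi = dat$vi, side = "right")
    # Original p-uniform: based on significant effects only
    puniform(yi = dat$yi, vi = dat$vi, side = "right")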

Viechtbauer W (2023). selmodel: Selection models for meta-analysis (function in the metafor package). https://wviechtb.github.io/metafor/reference/selmodel.html
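
A minimal sketch of a step-function (Vevea-Hedges-type) selection model in metafor, with a single cutpoint at the one-sided p = .025 threshold; dat is a hypothetical data frame:

    library(metafor)
    # Fit a random-effects model first, then the selection model on top of it
    res <- rma(yi, vi, data = dat)
    sel <- selmodel(res, type = "stepfun", steps = c(0.025))
    summary(sel)  # adjusted estimate and test of the selection parameters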


Mathias Harrer, MSc
Psychology & Digital Mental Health Care