Statistical frontiers for selective reporting and publication bias


Date: 2021-06-23, 12:30 PM
Location: Online

This workshop will cover methods for investigating selective reporting in meta-analyses of statistically dependent effect sizes, which are a common feature of systematic reviews in psychology. The workshop is organized into two sections. In the first section, we will describe situations in which dependent effect sizes arise and review methods for summarizing findings in their presence. We will then show how to create and interpret funnel plots, including tests of asymmetry, when effect sizes are dependent.

In the second section, we will present new statistical sensitivity analyses for publication bias that perform well even in meta-analyses that are small, have non-normal or dependent effect sizes, or are heterogeneous. These sensitivity analyses support statements such as “For publication bias to shift the observed point estimate to the null, ‘significant’ results would need to be at least 10-fold more likely to be published than negative or ‘non-significant’ results” or “No amount of publication bias could explain away the average effect.” In both sections, we will demonstrate the methods using R code and examples from real meta-analyses.
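
To give a flavor of the first section, the sketch below is not the workshop’s own material: it assumes the metafor package and a hypothetical data frame dat with one row per effect size (columns yi, vi, study, and es_id). It summarizes dependent effect sizes with a multilevel model and cluster-robust standard errors, draws a funnel plot, and runs an Egger-type asymmetry test by adding the standard error as a moderator; this is one common approach, not necessarily the one presented in the workshop.

    # Illustrative sketch (not the workshop's materials), assuming the metafor
    # package and a hypothetical data frame 'dat' with one row per effect size:
    # yi (effect estimate), vi (sampling variance), study (study ID), and
    # es_id (effect-size ID within study).
    library(metafor)

    # Multilevel random-effects model with effect sizes nested within studies
    res <- rma.mv(yi, vi, random = ~ 1 | study/es_id, data = dat)

    # Cluster-robust (sandwich) standard errors for the pooled estimate
    robust(res, cluster = dat$study)

    # Funnel plot of the individual effect sizes against their standard errors
    funnel(res)

    # Egger-type asymmetry test: regtest() only handles rma.uni fits, so add
    # the standard error as a moderator and inspect its coefficient
    egger <- rma.mv(yi, vi, mods = ~ sqrt(vi), random = ~ 1 | study/es_id,
                    data = dat)
    summary(egger)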
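
The quoted statements in the second section reflect selection-model reasoning: how strongly would ‘significant’ results have to be favored for publication to account for the observed pooled effect? Dedicated implementations exist (for example, the PublicationBias R package). As a generic stand-in rather than the workshop’s specific method, the sketch below uses metafor’s selmodel() step-function selection model on the same hypothetical dat, first estimating the relative publication probability of non-significant results and then fixing it at 1/10 to see how far the pooled estimate shifts. Unlike the workshop’s methods, this simple model treats the effect sizes as independent.

    # Generic stand-in for the sensitivity analyses (not the workshop's code):
    # a step-function selection model via metafor::selmodel(). It assumes the
    # same hypothetical 'dat' and, unlike the workshop's methods, treats the
    # effect sizes as independent.
    library(metafor)

    # selmodel() requires a univariate random-effects fit
    res_uni <- rma(yi, vi, data = dat)

    # Three-parameter selection model: effects with one-sided p < .025 (i.e.,
    # "significant" in the expected direction) are published with probability 1;
    # the relative publication probability of the rest is estimated
    sel <- selmodel(res_uni, type = "stepfun", steps = 0.025)
    summary(sel)

    # Sensitivity analysis in the spirit of the quoted statements: fix the
    # relative publication probability of non-significant results at 1/10
    # (i.e., significant results 10 times more likely to be published) and
    # see how much the pooled estimate shifts toward the null
    sel_10 <- selmodel(res_uni, type = "stepfun", steps = 0.025,
                       delta = c(1, 0.1))
    summary(sel_10)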