Copas-Jackson-type bounds for publication bias over a general class of selection models (2508.17716v1)
Abstract: Publication bias (PB) is one of the most serious threats to the accuracy of meta-analysis. Adjustment or sensitivity analysis based on selection models, which describe the probability of a study being published, provides a more objective evaluation of PB than widely used graphical methods such as the trim-and-fill method. Most existing methods rely on parametric selection models. The Copas-Jackson bound (C-J bound) provides a worst-case bound of an analytical form over a nonparametric class of selection models, which yields more robust conclusions than parametric sensitivity analysis. However, the nonparametric class of selection models in the C-J bound is restrictive and covers only parametric selection models that are monotone in the standard errors of the outcomes. The novelty of this paper is a method that constructs worst-case bounds over a general class of selection models, weakening the assumption in the C-J bound. We propose an efficient numerical method that obtains an approximate worst-case bound via tractable nonlinear programming with linear constraints. We substantiate the effectiveness of the proposed bound with extensive simulation studies and demonstrate its applicability with two real-world meta-analyses.
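The abstract's core computational idea, optimizing a bias-adjusted estimate over selection probabilities subject to linear constraints, can be illustrated with a toy sketch. This is not the paper's actual objective or constraint set; the ratio-form pooled estimate, the assumed overall publication probability `q`, and all data below are hypothetical, chosen only to show a nonlinear program with linear constraints in the style the abstract describes.

```python
# Toy sketch (assumptions, not the paper's method): find per-study selection
# probabilities p_i in [0, 1], constrained to have a given average q (a linear
# constraint), that minimize a selection-weighted pooled estimate -- a simple
# worst-case lower bound over a class of selection models.
import numpy as np
from scipy.optimize import minimize, LinearConstraint

rng = np.random.default_rng(0)
y = rng.normal(0.3, 0.2, size=10)   # hypothetical study effect estimates
s = rng.uniform(0.1, 0.3, size=10)  # hypothetical standard errors
w = 1.0 / s**2                      # inverse-variance weights
q = 0.7                             # assumed marginal publication probability
n = len(y)

def pooled(p):
    # Selection-weighted pooled estimate; nonlinear in p because of the ratio.
    return np.sum(p * w * y) / np.sum(p * w)

# Linear constraint: mean(p) == q; box bounds keep each p_i a probability.
cons = LinearConstraint(np.ones(n) / n, lb=q, ub=q)
res = minimize(pooled, x0=np.full(n, q), bounds=[(0.0, 1.0)] * n,
               constraints=[cons], method="SLSQP")
print("worst-case lower bound on pooled estimate:", res.fun)
```

The optimizer can only do at least as well as the feasible starting point (all `p_i = q`, the ordinary inverse-variance pooled estimate), so the returned value is a lower bound over this toy class; maximizing the same objective would give the corresponding upper bound.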