Optimization-centric cutting feedback for semiparametric models (2509.18708v1)
Abstract: Modern statistics deals with complex models in which the joint model used for inference is built by coupling submodels, called modules. We consider modular inference where the modules may depend on parametric and nonparametric components. In such cases, joint Bayesian inference is highly susceptible to misspecification in any module, and inappropriate priors for nonparametric components may deliver subpar inferences for parametric components, and vice versa. We propose a novel ``optimization-centric'' approach to cutting feedback for semiparametric modular inference, which can address misspecification and prior-data conflict. The proposed generalized cut posteriors are defined through a variational optimization problem for generalized posteriors in which regularization is based on R\'{e}nyi divergence rather than Kullback-Leibler divergence (KLD), and variational computational methods are developed for them. We show empirically that using R\'{e}nyi divergence to define the cut posterior delivers more robust inferences than KLD. We derive novel posterior concentration results that accommodate the R\'{e}nyi divergence and allow for semiparametric components, greatly extending existing results for cut posteriors, which were derived for parametric models and KLD. We demonstrate these new methods in a benchmark toy example and in two real examples: Gaussian process adjustments for confounding in causal inference, and misspecified copula models with nonparametric marginals.
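To illustrate the optimization-centric formulation mentioned in the abstract, the following is a minimal sketch in the style of generalized variational inference, assuming a two-module setup with parameters \(\theta_1\) (module 1, data \(Z_1\)) and \(\theta_2\) (module 2, data \(Z_2\)), losses \(\ell_1, \ell_2\), prior \(\pi\), and variational families \(\mathcal{Q}_1, \mathcal{Q}_2\); the exact objective in the paper may differ, and \(D_{\alpha}\) denotes a R\'{e}nyi divergence replacing the usual KLD regularizer:

\[
  \widehat{q}_1
  = \operatorname*{arg\,min}_{q_1 \in \mathcal{Q}_1}
    \Big\{ \mathbb{E}_{q_1}\!\big[\ell_1(\theta_1; Z_1)\big]
           + D_{\alpha}\big(q_1 \,\|\, \pi(\theta_1)\big) \Big\},
\]
\[
  \widehat{q}_{2}
  = \operatorname*{arg\,min}_{q_2 \in \mathcal{Q}_2}
    \Big\{ \mathbb{E}_{\widehat{q}_1 q_2}\!\big[\ell_2(\theta_1, \theta_2; Z_2)\big]
           + \mathbb{E}_{\widehat{q}_1}\!\big[ D_{\alpha}\big(q_2 \,\|\, \pi(\theta_2 \mid \theta_1)\big) \big] \Big\}.
\]

The two-stage structure encodes the cut: \(\widehat{q}_1\) is determined by module 1 alone, and the second-stage problem treats \(\widehat{q}_1\) as fixed, so no feedback flows from module 2 back to \(\theta_1\); setting \(\alpha \to 1\) recovers a KLD-regularized (standard variational) cut posterior.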