Posterior Projection for Inference in Constrained Spaces (1812.05741v5)
Abstract: Estimating parameters that obey specific constraints, such as boundedness, monotonicity, or linear inequalities, is crucial in statistics and machine learning. Traditional approaches impose these constraints via constraint-specific transformations or by truncating the posterior distribution; such methods often lead to computational challenges, limited flexibility, and a lack of generality. We propose a generalized framework for constrained Bayesian inference that projects the unconstrained posterior distribution onto the constrained parameter space, yielding a computationally efficient and easily implementable solution for a large class of problems. We rigorously establish the theoretical foundations of the projected posterior distribution and provide asymptotic results on posterior consistency, posterior contraction, and optimal coverage properties. Our methodology is validated through both theoretical arguments and practical applications, including bounded-monotonic regression and emulation of a computer model with directional outputs.
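The core idea, sampling from an unconstrained posterior and mapping each draw onto the constrained space, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the box constraint [0, 1]^3, the Gaussian stand-in for posterior draws, and all numeric values are assumptions chosen for the example. For a box constraint, the Euclidean projection is exact coordinate-wise clipping.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for draws from an unconstrained posterior over a
# 3-dimensional parameter (illustrative values, not from the paper).
samples = rng.normal(loc=[0.2, -0.5, 1.4], scale=0.3, size=(1000, 3))

# Euclidean projection onto the box constraint [0, 1]^3:
# for a box, the nearest constrained point is obtained by
# clipping each coordinate independently.
projected = np.clip(samples, 0.0, 1.0)

# Every projected draw now lies in the constrained space, and
# draws already inside the box are left unchanged.
inside = (samples >= 0.0).all(axis=1) & (samples <= 1.0).all(axis=1)
assert np.array_equal(projected[inside], samples[inside])
```

The projected draws form a sample from the projected posterior; summaries such as means or credible intervals are then computed from `projected` exactly as they would be from ordinary posterior draws. For monotonicity constraints the projection step would instead be an isotonic regression rather than clipping.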