Bayesian Bridge Gaussian Process Regression (2511.17415v1)
Abstract: The performance of Gaussian Process (GP) regression is often hampered by the curse of dimensionality, which inflates computational cost and reduces predictive power in high-dimensional problems. Variable selection is thus crucial for building efficient and accurate GP models. Inspired by Bayesian bridge regression, we propose the Bayesian Bridge Gaussian Process Regression (B$^2$GPR) model. This framework places $\ell_q$-norm constraints on key GP parameters to automatically induce sparsity and identify active variables. We formulate two distinct versions: one for $q=2$ using conjugate Gaussian priors, and another for $0<q<2$ that employs constrained flat priors, leading to non-standard, norm-constrained posterior distributions. To enable posterior inference, we design a Gibbs sampling algorithm that integrates Spherical Hamiltonian Monte Carlo (SphHMC) to efficiently sample from the constrained posteriors when $0<q<2$. Simulations and a real-data application confirm that B$^2$GPR offers superior variable selection and prediction compared to alternative approaches.
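To make the norm-constrained sampling problem concrete, here is a minimal sketch of drawing from a posterior restricted to the $\ell_q$ ball for $0<q<2$. It uses a simple rejection-within-Metropolis scheme as a stand-in for the paper's far more efficient SphHMC-within-Gibbs sampler; the names `log_post`, `radius`, and the toy target are hypothetical placeholders, not the paper's model.

```python
import numpy as np

# Minimal sketch (NOT the paper's SphHMC): random-walk Metropolis
# restricted to the l_q ball {theta : sum_j |theta_j|^q <= radius^q}.

def lq_norm_q(theta, q):
    """Return sum_j |theta_j|^q, the q-th power of the l_q 'norm'."""
    return np.sum(np.abs(theta) ** q)

def constrained_metropolis(log_post, theta0, q=1.0, radius=1.0,
                           step=0.05, n_iter=5000, rng=None):
    """Sample from log_post restricted to the l_q ball of given radius.

    Proposals leaving the ball are rejected outright; with a symmetric
    proposal this leaves the constrained posterior (flat prior on the
    ball, as in the 0 < q < 2 case of the abstract) invariant. SphHMC
    is a far more efficient choice in practice.
    """
    rng = np.random.default_rng(rng)
    theta = np.asarray(theta0, dtype=float)
    assert lq_norm_q(theta, q) <= radius ** q, "start inside the ball"
    lp = log_post(theta)
    samples = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.size)
        if lq_norm_q(prop, q) <= radius ** q:  # enforce the constraint
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                theta, lp = prop, lp_prop
        samples[i] = theta
    return samples

# Toy usage: a standard-normal "posterior" constrained to the l_0.5 ball.
if __name__ == "__main__":
    draws = constrained_metropolis(
        log_post=lambda t: -0.5 * np.sum(t ** 2),
        theta0=np.zeros(3), q=0.5, radius=1.5)
    print(draws.mean(axis=0))
```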