Nonparametric Shrinkage Estimation in High Dimensional Generalized Linear Models via Polya Trees
Abstract: Regularization in fitting regression models has been a highly active topic of research in the past few decades, but most existing methods are designed for particular situations, e.g. for the case of a sparse coefficient vector. We consider the problem of designing $\textit{universally}$ optimal regularized estimators in a given generalized linear model with fixed effects. First, we propose as a contender the Bayes estimator against an $\textit{ideal}$ prior that assigns equal mass to every permutation of the fixed coefficient vector, and thus depends on the true coefficients only through their empirical CDF. We prove some optimality properties of this oracle estimator in both the frequentist and Bayesian frameworks. To compete with the oracle estimator, we posit a hierarchical Bayes model in which the individual coefficients are modeled as i.i.d. draws from a common distribution $\pi$, which is in turn assigned a Polya tree prior to reflect uncertainty about its form. We demonstrate in examples that the posterior mean of $\pi$ under the postulated model adapts nonparametrically to the empirical CDF of the true coefficients. Correspondingly, the posterior means of the coefficients themselves are used to mimic the ideal estimator. Numerical experiments show that our method has better estimation and prediction accuracy than various parametric and nonparametric alternatives, from relatively standard $L_p$-regularized estimators to modern penalized-likelihood and Bayesian estimators for high dimensional regression.
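To make the hierarchical model concrete, the following is a minimal illustrative sketch (not the authors' implementation; all function names and the depth/concentration defaults are our own choices) of drawing a random distribution $\pi$ from a finite-depth Polya tree on $[0,1]$ and then sampling i.i.d. coefficients from that single draw. It uses the standard construction in which the split probability at level $m$ is Beta$(c\,m^2, c\,m^2)$, a common choice that yields absolutely continuous draws in the infinite-depth limit; a real-line version would map the dyadic partition through a base quantile function such as the standard normal's.

```python
import numpy as np

def sample_polya_tree_probs(depth, c=1.0, rng=None):
    """Sample leaf probabilities of a depth-`depth` Polya tree on [0, 1].

    At level m, each node's probability of sending mass to its left child
    is an independent Beta(c*m^2, c*m^2) draw; leaf probabilities are the
    products of the branch probabilities along each root-to-leaf path.
    """
    rng = np.random.default_rng(rng)
    probs = np.ones(1)
    for m in range(1, depth + 1):
        v = rng.beta(c * m**2, c * m**2, size=probs.size)  # P(go left) per node
        # Interleave left/right children so leaves stay in dyadic order.
        probs = np.column_stack([probs * v, probs * (1 - v)]).ravel()
    return probs  # probabilities of the 2**depth dyadic leaf intervals

def sample_coefficients(n, depth=8, c=1.0, rng=None):
    """Draw n i.i.d. coefficients from one random distribution pi ~ Polya tree."""
    rng = np.random.default_rng(rng)
    probs = sample_polya_tree_probs(depth, c, rng)
    leaves = rng.choice(probs.size, size=n, p=probs)  # pick a leaf interval
    width = 1.0 / probs.size
    return (leaves + rng.random(n)) * width           # uniform within the leaf

coefs = sample_coefficients(1000, depth=8, rng=0)
```

In the paper's setting this sampler plays the role of the prior's generative story; posterior inference over $\pi$ (and hence over the coefficients) would update the Beta branch probabilities given the observed data.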