Fast Marginal Likelihood Estimation of the Ridge Parameter(s) in Ridge Regression and Generalized Ridge Regression for Big Data (1409.2437v5)
Abstract: Unlike the ordinary least-squares (OLS) estimator for the linear model, a ridge regression linear model provides coefficient estimates via shrinkage, usually with improved mean-squared error and prediction error. This is especially true when the observed design matrix is ill-conditioned or singular, either as a result of highly correlated covariates or the number of covariates exceeding the sample size. This paper introduces novel and fast marginal maximum likelihood (MML) algorithms for estimating the shrinkage parameter(s) of the Bayesian ridge and power ridge regression models, and an automatic plug-in MML estimator for the Bayesian generalized ridge regression model. With the aid of the singular value decomposition (SVD) of the observed covariate design matrix, these MML estimation methods are fast even for data sets where either the sample size (n) or the number of covariates (p) is very large, and even when p>n. On several real data sets varying widely in n and p, the computation times of the MML estimation methods for the three ridge models are compared with those of other methods for estimating the shrinkage parameter in ridge, LASSO, and Elastic Net (EN) models, where the other methods minimize prediction error according to cross-validation or information criteria. The ridge, LASSO, and EN models, and their associated estimation methods, are also compared in terms of prediction accuracy. Furthermore, a simulation study compares the ridge models under MML estimation against the LASSO and EN models in terms of their ability to differentiate between truly significant covariates (i.e., with non-zero slope coefficients) and truly insignificant covariates (with zero coefficients).
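To illustrate the computational point in the abstract, the following sketch shows why a single SVD of the design matrix makes refitting ridge regression for many shrinkage parameters cheap. This is a hypothetical illustration of the standard SVD identity for the ridge estimator, not the paper's MML algorithms; the function name and variables are my own.

```python
import numpy as np

def ridge_path_via_svd(X, y, lams):
    """Ridge coefficients for each shrinkage parameter in lams,
    reusing one SVD of X. Works for both n >= p and p > n."""
    # One thin SVD: X = U diag(d) Vt. This is the expensive step,
    # done once; each additional lam costs only O(min(n, p) * p).
    U, d, Vt = np.linalg.svd(X, full_matrices=False)
    Uty = U.T @ y
    # Ridge identity: beta(lam) = V diag(d / (d^2 + lam)) U^T y,
    # equal to (X^T X + lam I)^{-1} X^T y when that inverse exists.
    return [Vt.T @ ((d / (d**2 + lam)) * Uty) for lam in lams]
```

The same factorization underlies fast evaluation of the marginal likelihood across candidate shrinkage values, since the singular values diagonalize the relevant matrix computations.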