Non-null Shrinkage Regression and Subset Selection via Fractional Ridge Regression
Abstract: $\ell_p$-norm penalization, notably the Lasso, has become a standard technique, extending shrinkage regression to subset selection. Despite aiming for oracle properties and consistent estimation, existing Lasso-derived methods still rely on shrinkage toward a null model, necessitating careful tuning parameter selection and yielding monotone variable selection. This research introduces Fractional Ridge Regression (Fridge), a novel generalization of the Lasso penalty that penalizes only a fraction of the coefficients. Critically, Fridge shrinks the model toward a non-null model of a prespecified target size, even under extreme regularization. By selectively penalizing coefficients associated with less important variables, Fridge aims to reduce bias, improve performance relative to the Lasso, and offer more intuitive model interpretation while retaining certain advantages of best subset selection.
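To illustrate the abstract's central idea, the following is a minimal sketch of a Fridge-style estimator: at each proximal gradient step, the $k$ coefficients largest in absolute value are left unpenalized (the prespecified non-null target model), while all remaining coefficients receive the usual Lasso soft-thresholding. The function name `fridge_like_fit` and the specific update rule are hypothetical illustrations, not the paper's actual algorithm.

```python
import numpy as np

def fridge_like_fit(X, y, k, lam=0.1, lr=0.05, n_iter=2000):
    """Sketch of a fractional penalty: soft-threshold only the coefficients
    outside the k largest in magnitude, so the model shrinks toward a
    non-null model of target size k rather than toward zero.

    Hypothetical illustration; not the paper's estimator.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        # gradient step on the least-squares loss
        grad = X.T @ (X @ beta - y) / n
        beta = beta - lr * grad
        # the k largest-magnitude coefficients stay unpenalized
        penalized = np.ones(p, dtype=bool)
        if k > 0:
            penalized[np.argsort(np.abs(beta))[-k:]] = False
        # Lasso proximal (soft-thresholding) step on the remaining fraction
        beta[penalized] = np.sign(beta[penalized]) * np.maximum(
            np.abs(beta[penalized]) - lr * lam, 0.0
        )
    return beta
```

With heavy regularization this sketch still retains $k$ nonzero coefficients, mimicking the non-null limiting model the abstract describes; setting $k = 0$ recovers ordinary Lasso-style proximal gradient, where extreme regularization collapses to the null model.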