
Variable selection in high-dimensional linear model with possibly asymmetric or heavy-tailed errors (1812.03121v1)

Published 7 Dec 2018 in math.ST and stat.TH

Abstract: We consider the problem of automatic variable selection in a linear model with asymmetric or heavy-tailed errors when the number of explanatory variables diverges with the sample size. For this high-dimensional model, the penalized least squares method is not appropriate, and the quantile framework makes inference more difficult because of the non-differentiability of the loss function. We propose and study an estimation method that penalizes the expectile process with an adaptive LASSO penalty. Two cases are considered: the number of model parameters is first smaller and then larger than the sample size, the two cases differing in the adaptive penalties used. For each case we give the convergence rate and establish the oracle properties of the adaptive LASSO expectile estimator. The proposed estimators are evaluated through Monte Carlo simulations and compared with the adaptive LASSO quantile estimator. We also apply our estimation method to real genetic data where the number of parameters is greater than the sample size.
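
To make the estimation idea concrete, here is a minimal sketch (not the authors' implementation) of adaptive-LASSO-penalized expectile regression, fit by proximal gradient descent with weighted soft-thresholding. The ridge initial estimator, the step-size rule, and all function names are illustrative assumptions; the paper's specific penalty choices for the p > n case may differ.

```python
# Minimal sketch of adaptive LASSO expectile regression (assumptions noted above).
import numpy as np

def expectile_grad(X, y, beta, tau):
    """Gradient of the expectile (asymmetric squared) loss
    rho_tau(u) = |tau - 1{u < 0}| * u^2, averaged over observations."""
    r = y - X @ beta
    w = np.where(r < 0, 1.0 - tau, tau)            # asymmetric weights
    return -2.0 * X.T @ (w * r) / len(y)

def adaptive_lasso_expectile(X, y, tau=0.7, lam=0.1, gamma=1.0, n_iter=2000):
    n, p = X.shape
    # Initial estimator for the adaptive weights (ridge here; an assumption).
    beta_init = np.linalg.solve(X.T @ X + 1e-2 * np.eye(p), X.T @ y)
    w = 1.0 / (np.abs(beta_init) ** gamma + 1e-8)   # adaptive penalty weights
    step = len(y) / (2.0 * np.linalg.norm(X, 2) ** 2)  # crude Lipschitz bound
    beta = np.zeros(p)
    for _ in range(n_iter):
        z = beta - step * expectile_grad(X, y, beta, tau)
        # Proximal step: soft-thresholding with coordinate-wise adaptive weights.
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam * w, 0.0)
    return beta

# Toy usage: sparse signal with asymmetric (centered exponential) errors.
rng = np.random.default_rng(0)
n, p = 100, 80
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.exponential(1.0, n) - 1.0
beta_hat = adaptive_lasso_expectile(X, y)
print("selected variables:", np.flatnonzero(np.abs(beta_hat) > 1e-3))
```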
