
Support estimation in high-dimensional heteroscedastic mean regression (2011.01591v1)

Published 3 Nov 2020 in math.ST, stat.ML, and stat.TH

Abstract: A current strand of research in high-dimensional statistics deals with robustifying the available methodology with respect to deviations from the pervasive light-tail assumptions. In this paper we consider a linear mean regression model with random design and potentially heteroscedastic, heavy-tailed errors, and investigate support estimation in this framework. We use a strictly convex, smooth variant of the Huber loss function with tuning parameter depending on the parameters of the problem, as well as the adaptive LASSO penalty for computational efficiency. For the resulting estimator we show sign-consistency and optimal rates of convergence in the $\ell_\infty$ norm as in the homoscedastic, light-tailed setting. In our analysis, we have to deal with the issue that the support of the target parameter in the linear mean regression model and its robustified version may differ substantially even for small values of the tuning parameter of the Huber loss function. Simulations illustrate the favorable numerical performance of the proposed methodology.
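To make the described procedure concrete, below is a minimal sketch of a two-stage estimator in the spirit of the abstract: a strictly convex, smooth Huber-type loss (the pseudo-Huber loss is used here as a stand-in for the paper's variant) combined with an adaptive LASSO penalty, solved by proximal gradient descent. This is not the authors' implementation; the pilot estimator, the loss tuning parameter `tau`, the penalty level `lam`, and the step-size rule are all illustrative assumptions.

```python
# Sketch of a robust, sparse mean-regression estimator in the spirit of the abstract:
# smooth Huber-type loss + adaptive LASSO penalty, fit by proximal gradient (ISTA).
# All tuning choices below are illustrative, not taken from the paper.
import numpy as np
from sklearn.linear_model import LassoCV

def pseudo_huber_grad(r, tau):
    """Derivative of the pseudo-Huber loss tau^2*(sqrt(1+(r/tau)^2)-1) w.r.t. r."""
    return r / np.sqrt(1.0 + (r / tau) ** 2)

def adaptive_huber_lasso(X, y, tau=1.0, lam=0.1, n_iter=500, eps=1e-3):
    n, p = X.shape
    # Stage 1: pilot estimate (here a plain cross-validated LASSO) used only to
    # build the adaptive weights w_j = 1 / (|beta_init_j| + eps).
    beta_init = LassoCV(cv=5).fit(X, y).coef_
    w = 1.0 / (np.abs(beta_init) + eps)
    # Stage 2: proximal gradient on the smooth robust loss with the weighted
    # l1 penalty lam * sum_j w_j |beta_j|.
    step = n / (np.linalg.norm(X, 2) ** 2)  # 1 / Lipschitz bound of the smooth part
    beta = np.zeros(p)
    for _ in range(n_iter):
        r = y - X @ beta
        grad = -(X.T @ pseudo_huber_grad(r, tau)) / n
        z = beta - step * grad
        # Coordinate-wise soft-thresholding with thresholds step * lam * w_j.
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam * w, 0.0)
    return beta

# Toy usage: sparse signal with heteroscedastic, heavy-tailed errors.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:3] = [2.0, -1.5, 1.0]
noise = (1.0 + np.abs(X[:, 0])) * rng.standard_t(df=2.5, size=n)
y = X @ beta_true + noise
beta_hat = adaptive_huber_lasso(X, y)
print("estimated support:", np.nonzero(np.abs(beta_hat) > 1e-8)[0])
```

In the paper, the tuning parameter of the Huber-type loss is chosen as a function of the problem's parameters (rather than the fixed `tau` above), and it is this calibration, together with the adaptive weights, that underpins the stated sign-consistency and $\ell_\infty$-rate guarantees.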
