
Non-Asymptotic Guarantees for Robust Statistical Learning under Infinite Variance Assumption

Published 10 Jan 2022 in stat.ML, cs.LG, math.ST, and stat.TH (arXiv:2201.03182v2)

Abstract: There has been a surge of interest in developing robust estimators for models with heavy-tailed, bounded-variance data in statistics and machine learning, while few works address the unbounded-variance setting. This paper proposes two types of robust estimators: the ridge log-truncated M-estimator and the elastic net log-truncated M-estimator. The first estimator is applied to convex regressions such as quantile regression and generalized linear models, while the other is applied to high-dimensional non-convex learning problems such as regression via deep neural networks. Simulations and real data analysis demonstrate the robustness of log-truncated estimations over standard estimations.
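The core idea behind log-truncation is to pass each sample's loss through a slowly growing (logarithmic) influence function before averaging, so that a single heavy-tailed observation cannot dominate the empirical objective. A minimal sketch of a ridge log-truncated objective, assuming a Catoni-style truncation function and a squared-error base loss (the function names, the choice of base loss, and the `lam`/`tau` defaults are illustrative assumptions, not the paper's exact construction):

```python
import numpy as np

def log_truncate(x):
    # Catoni-style odd influence function: psi(x) = sign(x) * log(1 + |x| + x^2/2).
    # It grows only logarithmically, so one extreme sample has bounded impact.
    return np.sign(x) * np.log1p(np.abs(x) + 0.5 * x * x)

def ridge_log_truncated_objective(beta, X, y, lam=0.1, tau=5.0):
    # Per-sample squared-error losses, truncated at scale tau, plus an l2 (ridge) penalty.
    losses = (y - X @ beta) ** 2
    return tau * np.mean(log_truncate(losses / tau)) + lam * np.dot(beta, beta)

# Illustrative comparison: a gross outlier inflates the plain ridge objective far
# more than the log-truncated one.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(size=50)
y_corrupt = y.copy()
y_corrupt[0] += 1e4  # heavy-tailed contamination

plain = lambda b, yy: np.mean((yy - X @ b) ** 2) + 0.1 * np.dot(b, b)
plain_jump = plain(beta_true, y_corrupt) - plain(beta_true, y)
trunc_jump = (ridge_log_truncated_objective(beta_true, X, y_corrupt)
              - ridge_log_truncated_objective(beta_true, X, y))
```

Here `tau` sets the truncation scale: for losses well below `tau` the objective behaves like ordinary ridge-penalized least squares, while losses far above `tau` contribute only logarithmically. The elastic net variant would replace the `lam * np.dot(beta, beta)` term with a combined l1/l2 penalty.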

Citations (7)
