On the super-efficiency and robustness of the least squares of depth-trimmed regression estimator (2501.14791v1)

Published 10 Jan 2025 in stat.AP

Abstract: The least squares of depth-trimmed (LST) residuals regression, proposed and studied in Zuo and Zuo (2023), serves as a robust alternative to the classic least squares (LS) regression as well as a strong competitor to the renowned robust least trimmed squares (LTS) regression of Rousseeuw (1984). The aim of this article is three-fold: (i) to reveal the super-efficiency of the LST and demonstrate that it can be as efficient as (or even more efficient than) the LS when the errors are uncorrelated, have mean zero, and are homoscedastic with finite variance, and to explain this anti-Gauss-Markov-theorem phenomenon; (ii) to demonstrate that the LST can outperform the LTS, the benchmark robust regression estimator, in robustness, and the MM estimator of Yohai (1987), the benchmark for combined efficiency and robustness, in both efficiency and robustness, and consequently that it could serve as an alternative to both; and (iii) to promote the implementation and computation of the LST regression for a broad group of statisticians in statistical practice and to demonstrate that, based on a newly improved algorithm, it can be computed as fast as (or even faster than) the LTS.
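
The abstract contrasts classical LS with trimming-based robust estimators such as LTS and LST. As a rough illustration of the trimming idea only (this is not the authors' LST estimator nor the FAST-LTS algorithm), the sketch below fits ordinary least squares to contaminated data and then refits on the h observations with the smallest squared residuals; the synthetic data, the contamination scheme, and the trimming fraction are assumptions made purely for illustration.

```python
# Illustrative sketch only: a naive one-step residual-trimmed refit.
# It is NOT the LST estimator of Zuo and Zuo (2023) nor the LTS/FAST-LTS
# algorithm; data and the trimming fraction h are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 2
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)
y[:20] += 15.0                        # contaminate 10% of responses with outliers

X1 = np.column_stack([np.ones(n), X])  # design matrix with intercept column

# Plain least squares (sensitive to the outliers)
beta_ls, *_ = np.linalg.lstsq(X1, y, rcond=None)

# Naive trimmed refit: keep the h points with the smallest squared LS residuals,
# then refit LS on that subset (a crude stand-in for trimming-based estimators)
h = int(0.75 * n)
resid2 = (y - X1 @ beta_ls) ** 2
keep = np.argsort(resid2)[:h]
beta_trim, *_ = np.linalg.lstsq(X1[keep], y[keep], rcond=None)

print("true coefficients :", beta_true)
print("LS fit            :", beta_ls)
print("trimmed refit     :", beta_trim)
```

In practice, estimators in this family do not stop after a single trimmed refit: LTS-type procedures iterate such concentration steps toward a minimum of a trimmed objective, and the paper's claim is that the LST objective can be computed comparably fast with an improved algorithm. The single pass above is meant only to convey the intuition of down-weighting large-residual observations.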
