Asymptotic Breakdown Point Analysis for a General Class of Minimum Divergence Estimators (2304.07466v3)

Published 15 Apr 2023 in math.ST and stat.TH

Abstract: Robust inference based on the minimization of statistical divergences has proved to be a useful alternative to classical techniques based on maximum likelihood and related methods. Basu et al. (1998) introduced the density power divergence (DPD) family as a measure of discrepancy between two probability density functions and used it for robust parameter estimation with independent and identically distributed data. Ghosh et al. (2017) proposed a more general class of divergence measures, the S-divergence family, and demonstrated its usefulness in robust parametric estimation through several asymptotic properties and numerical illustrations. In this paper, we derive results on the asymptotic breakdown point of the minimum S-divergence estimators (in particular, the minimum DPD estimator) under general model setups. The primary result provides lower bounds on the asymptotic breakdown point of these estimators that are independent of the dimension of the data, corroborating their usefulness for robust inference with high-dimensional data.
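
For context, the two divergence families referenced in the abstract have standard forms in the literature (Basu et al., 1998; Ghosh et al., 2017). The sketch below records those standard definitions; the paper's exact parametrization may differ slightly.

```latex
% Standard definitions, assuming densities g (data) and f (model), alpha > 0.
% Density power divergence (Basu et al., 1998):
\[
d_\alpha(g, f) = \int \left\{ f^{1+\alpha}
  - \left(1 + \tfrac{1}{\alpha}\right) f^{\alpha} g
  + \tfrac{1}{\alpha}\, g^{1+\alpha} \right\} dx .
\]
% S-divergence family (Ghosh et al., 2017), with
% A = 1 + \lambda(1-\alpha) and B = \alpha - \lambda(1-\alpha), so A + B = 1 + \alpha:
\[
S_{(\alpha,\lambda)}(g, f) = \frac{1}{A} \int f^{1+\alpha}\, dx
  - \frac{1+\alpha}{AB} \int f^{B} g^{A}\, dx
  + \frac{1}{B} \int g^{1+\alpha}\, dx ,
\]
% which reduces to the DPD at \lambda = 0 (A = 1, B = \alpha).
```

To make the estimation idea concrete, here is a minimal Python sketch (not the paper's code) of minimum DPD estimation for a univariate normal model, using the standard empirical objective for i.i.d. data; the choice alpha = 0.5 and the 10% contamination scenario are illustrative assumptions.

```python
# Minimal sketch of minimum DPD estimation for N(mu, sigma^2), alpha > 0.
# Minimizes H_n(theta) = int f_theta^{1+alpha} dx
#                        - (1 + 1/alpha) * mean(f_theta(X_i)^alpha)
# over theta = (mu, sigma); the integral has a closed form for the normal.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(theta, x, alpha):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)  # log-scale reparametrization keeps sigma > 0
    # int N(.; mu, sigma^2)^{1+alpha} dx = (2*pi*sigma^2)^(-alpha/2) / sqrt(1+alpha)
    integral = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    density_term = np.mean(norm.pdf(x, mu, sigma) ** alpha)
    return integral - (1 + 1 / alpha) * density_term

rng = np.random.default_rng(0)
# 90 observations from N(0, 1) plus 10 outliers from N(10, 1)
x = np.concatenate([rng.normal(0, 1, 90), rng.normal(10, 1, 10)])
fit = minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x, 0.5))
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(mu_hat, sigma_hat)  # should stay close to (0, 1) despite contamination
```

The log-scale reparametrization avoids constrained optimization, and larger alpha trades efficiency for robustness; the breakdown analysis in this paper concerns how much contamination such estimators tolerate asymptotically.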
