On minimum Bregman divergence inference (2008.06987v1)

Published 16 Aug 2020 in math.ST, stat.ME, and stat.TH

Abstract: In this paper a new family of minimum divergence estimators based on the Bregman divergence is proposed. The popular density power divergence (DPD) class of estimators is a sub-class of Bregman divergences. We propose and study a new sub-class of Bregman divergences called the exponentially weighted divergence (EWD). Like the minimum DPD estimator, the minimum EWD estimator is recognised as an M-estimator. This characterisation is useful when discussing the asymptotic behaviour as well as the robustness properties of this class of estimators. The performances of the two classes are compared, both through simulations and through real-life examples. We develop an estimation process not only for independent and homogeneous data, but also for non-homogeneous data. General tests of parametric hypotheses based on the Bregman divergences are also considered. We establish the asymptotic null distribution of our proposed test statistic and explore its behaviour when applied to real data. The inference procedures generated by the new EWD divergence appear to be competitive with, or better than, the DPD-based procedures.
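To make the minimum-divergence idea concrete, here is a minimal sketch of minimum DPD estimation (the sub-class the paper compares against) for a normal location-scale model. This is an illustration under stated assumptions, not the paper's implementation: the tuning parameter alpha = 0.5, the contaminated sample, and all function names are choices made for the example. For the normal family the model-density integral in the DPD objective has the closed form (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha).

```python
# Illustrative minimum density power divergence (DPD) fit for N(mu, sigma^2).
# A hypothetical sketch, not the paper's code; alpha and the data are assumptions.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha):
    """Empirical DPD objective for the normal model:
       integral f_theta^(1+alpha) dx - (1 + 1/alpha) * mean(f_theta(x)^alpha).
    Minimising over theta = (mu, sigma) gives the minimum DPD estimator."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # parameterise by log(sigma) to keep sigma > 0
    # Closed-form integral of the (1+alpha)-power of the normal density
    integral = (2.0 * np.pi * sigma**2) ** (-alpha / 2.0) / np.sqrt(1.0 + alpha)
    density = norm.pdf(x, loc=mu, scale=sigma)
    return integral - (1.0 + 1.0 / alpha) * np.mean(density ** alpha)

# Simulated data: 95 clean N(0, 1) observations plus 5 gross outliers near 10
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])

# Minimise the DPD objective, starting from the median and log(sigma) = 0
res = minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x, 0.5))
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

Unlike the sample mean, which is pulled toward the outliers, the minimum DPD estimate of mu stays close to the centre of the clean data; larger alpha trades efficiency for more robustness, which is the trade-off the paper studies for both the DPD and EWD sub-classes.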