Cramér-Rao Lower Bounds Arising from Generalized Csiszár Divergences (2001.04769v2)

Published 14 Jan 2020 in cs.IT, eess.SP, math.IT, math.ST, stat.ML, and stat.TH

Abstract: We study the geometry of probability distributions with respect to a generalized family of Csiszár f-divergences. A member of this family is the relative α-entropy, which is also a Rényi analog of relative entropy in information theory and is known as the logarithmic or projective power divergence in statistics. We apply Eguchi's theory to derive the Fisher information metric and the dual affine connections arising from these generalized divergence functions. This enables us to arrive at a more widely applicable version of the Cramér-Rao inequality, which provides a lower bound for the variance of an estimator of an escort of the underlying parametric probability distribution. We then extend Amari and Nagaoka's dually flat structure of the exponential and mixture models to other distributions with respect to the aforementioned generalized metric. We show that these formulations lead to unbiased and efficient estimators for the escort model. Finally, we compare our work with prior results on generalized Cramér-Rao inequalities that were derived from non-information-geometric frameworks.
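For readers less familiar with the objects named in the abstract, the standard definitions are as follows. Note that the paper works with a generalized family of divergences, so its exact statements differ; these are only the classical forms being generalized.

```latex
\[
D_f(p \,\|\, q) \;=\; \sum_x q(x)\, f\!\Big(\tfrac{p(x)}{q(x)}\Big),
\qquad f \text{ convex},\ f(1) = 0
\quad \text{(Csisz\'ar $f$-divergence)}
\]
\[
p^{(\alpha)}(x) \;=\; \frac{p(x)^{\alpha}}{\sum_y p(y)^{\alpha}}
\quad \text{(escort distribution of order } \alpha\text{)}
\]
\[
\operatorname{Var}_\theta\big(\hat{\theta}\big) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) \;=\; \mathbb{E}_\theta\!\left[\big(\partial_\theta \log p_\theta(X)\big)^2\right]
\quad \text{(classical Cram\'er-Rao bound, unbiased } \hat{\theta}\text{)}
\]
```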
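As a minimal numerical illustration of two of these objects, the sketch below (not from the paper) computes an escort distribution and checks the classical Cramér-Rao bound for a Bernoulli family, where the sample mean is an efficient unbiased estimator and so attains the bound. The helpers escort and fisher_information_bernoulli are hypothetical names introduced here, not the authors' code.

```python
import numpy as np

def escort(p, alpha):
    """Escort distribution p^alpha / sum(p^alpha) (hypothetical helper)."""
    w = np.asarray(p, dtype=float) ** alpha
    return w / w.sum()

def fisher_information_bernoulli(theta):
    """Fisher information of Bernoulli(theta): 1 / (theta * (1 - theta))."""
    return 1.0 / (theta * (1.0 - theta))

rng = np.random.default_rng(0)
theta, n, trials = 0.3, 200, 20000

# The sample mean is the unbiased MLE for Bernoulli; its variance should
# match the Cramér-Rao lower bound 1 / (n * I(theta)) = theta(1-theta)/n.
estimates = rng.binomial(n, theta, size=trials) / n
print("empirical variance :", estimates.var())
print("Cramér-Rao bound   :", 1.0 / (n * fisher_information_bernoulli(theta)))

# Escort of the Bernoulli pmf [0.3, 0.7] for alpha = 0.5.
print("escort pmf         :", escort([theta, 1 - theta], 0.5))
```

The paper's contribution is a bound of this type for estimators of the escort model p^(α), derived information-geometrically via Eguchi's construction rather than by the classical argument sketched here.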

Citations (3)
