
Minimization Problems Based on Relative $\alpha$-Entropy I: Forward Projection (1410.2346v3)

Published 9 Oct 2014 in cs.IT, math.IT, math.ST, and stat.TH

Abstract: Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative $\alpha$-entropies (denoted $\mathscr{I}_{\alpha}$), arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual relative entropy (Kullback-Leibler divergence). Just like relative entropy, these relative $\alpha$-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. Minimizers of these relative $\alpha$-entropies on closed and convex sets are shown to exist. Such minimizations generalize the maximum Rényi or Tsallis entropy principle. The minimizing probability distribution (termed forward $\mathscr{I}_{\alpha}$-projection) for a linear family is shown to obey a power-law. Other results in connection with statistical inference, namely subspace transitivity and iterated projections, are also established. In a companion paper, a related minimization problem of interest in robust statistics that leads to a reverse $\mathscr{I}_{\alpha}$-projection is studied.
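For orientation, here is a minimal sketch of the quantity the abstract refers to. The abstract itself does not reproduce the definition, so the closed form below is stated as an assumption (it is the standard form used in this line of work) and should be checked against the paper:

```latex
% Relative alpha-entropy of P with respect to Q on a finite alphabet,
% for alpha > 0, alpha != 1 (assumed standard form, not quoted from the paper):
\[
  \mathscr{I}_{\alpha}(P,Q)
    = \frac{\alpha}{1-\alpha}\log\sum_{x} p(x)\,q(x)^{\alpha-1}
      \;-\; \frac{1}{1-\alpha}\log\sum_{x} p(x)^{\alpha}
      \;+\; \log\sum_{x} q(x)^{\alpha}.
\]
% Letting alpha -> 1 recovers the Kullback-Leibler divergence
% I(P||Q) = sum_x p(x) log(p(x)/q(x)), consistent with the abstract's claim
% that these quantities generalize relative entropy.
%
% The Pythagorean property mentioned in the abstract, in its usual form:
% if P* is the forward I_alpha-projection of Q on a closed convex set E, then
\[
  \mathscr{I}_{\alpha}(P,Q) \;\ge\; \mathscr{I}_{\alpha}(P,P^{*})
    + \mathscr{I}_{\alpha}(P^{*},Q)
  \qquad \text{for all } P \in E.
\]
```

A quick numerical sanity check of the $\alpha \to 1$ limit, again as a hedged sketch: the function name is hypothetical and the formula it implements is the assumed closed form above, not code from the paper.

```python
import numpy as np

def relative_alpha_entropy(p, q, alpha):
    """I_alpha(P, Q) on a finite alphabet, via the assumed closed form above."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    a = alpha
    return (a / (1 - a) * np.log(np.sum(p * q ** (a - 1)))
            - 1 / (1 - a) * np.log(np.sum(p ** a))
            + np.log(np.sum(q ** a)))

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
kl = float(np.sum(p * np.log(p / q)))            # Kullback-Leibler divergence
print(relative_alpha_entropy(p, q, 1.0 + 1e-5))  # should be close to kl
print(kl)
```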

Authors (2)
  1. M. Ashok Kumar (15 papers)
  2. Rajesh Sundaresan (49 papers)
Citations (4)
