Deformed Statistics Kullback-Leibler Divergence Minimization within a Scaled Bregman Framework (1102.1025v3)
Published 4 Feb 2011 in cond-mat.stat-mech, cs.IT, math-ph, math.IT, and math.MP
Abstract: The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics [constrained by the additive duality of generalized statistics (dual generalized K-Ld)] is here reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure-theoretic framework. Specifically, it is demonstrated that the dual generalized K-Ld is a scaled Bregman divergence. The Pythagorean theorem is derived from the minimum discrimination information principle, using the dual generalized K-Ld as the measure of uncertainty and constraints defined by normal averages. The minimization of the dual generalized K-Ld under normal-averages constraints is shown to exhibit distinctive features.
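
For context, the sketch below collects the standard Tsallis-statistics definitions behind the abstract's terminology. The notation follows one common convention and may differ from the paper's own normalizations; in particular, the choice of generator \phi and scaling measure m realizing the paper's result is not reproduced here.

```latex
% q-deformed logarithm; recovers \ln x in the limit q -> 1
\ln_q(x) \equiv \frac{x^{1-q} - 1}{1 - q}

% Generalized (Tsallis) K-Ld between densities p and r, one common convention
D_q[p \| r] \equiv \int p(x)\, \ln_q\!\frac{p(x)}{r(x)}\, dx

% Additive duality of generalized statistics: q^* = 2 - q, via the identity
\ln_q(1/x) = -\ln_{q^*}(x), \qquad q^* \equiv 2 - q
% The dual generalized K-Ld is the q -> q^* counterpart of D_q.

% Scaled Bregman divergence (Stummer--Vajda form): convex generator \phi,
% scaling measure m
B_\phi(p, r \mid m) \equiv \int m(x)\!\left[
  \phi\!\left(\tfrac{p(x)}{m(x)}\right) - \phi\!\left(\tfrac{r(x)}{m(x)}\right)
  - \phi'\!\left(\tfrac{r(x)}{m(x)}\right)\!\left(\tfrac{p(x)}{m(x)} - \tfrac{r(x)}{m(x)}\right)
\right] dx
```

In these terms, the paper's central claim is that the dual generalized K-Ld can be expressed in the scaled Bregman form B_\phi(p, r | m) for a suitable choice of \phi and m, and the ordinary Bregman setting is recovered when m is the counting or Lebesgue measure.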