Nonextensive Generalizations of the Jensen-Shannon Divergence (0804.1653v1)

Published 10 Apr 2008 in cs.IT, math.IT, math.ST, and stat.TH

Abstract: Convexity is a key concept in information theory, namely via the many implications of Jensen's inequality, such as the non-negativity of the Kullback-Leibler divergence (KLD). Jensen's inequality also underlies the concept of Jensen-Shannon divergence (JSD), which is a symmetrized and smoothed version of the KLD. This paper introduces new JSD-type divergences by extending its two building blocks: convexity and Shannon's entropy. In particular, a new concept of q-convexity is introduced and shown to satisfy a Jensen's q-inequality. Based on this Jensen's q-inequality, the Jensen-Tsallis q-difference is built, which is a nonextensive generalization of the JSD based on Tsallis entropies. Finally, the Jensen-Tsallis q-difference is characterized in terms of convexity and extrema.
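The sketch below is not taken from the paper; it is a minimal Python illustration of the quantities the abstract names, using the standard definitions of Shannon entropy, Tsallis q-entropy, and the JSD. The Jensen-Tsallis q-difference shown follows the commonly cited form in which the mixture weights are raised to the power q, so that q = 1 recovers the JSD; the function names, the equal-weight choice, and the example distributions are illustrative assumptions.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats, ignoring zero-probability entries."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def tsallis_entropy(p, q):
    """Tsallis q-entropy; reduces to Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        return shannon_entropy(p)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def jensen_shannon_divergence(p1, p2):
    """JSD: entropy of the equal-weight mixture minus the mean entropy."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    m = 0.5 * (p1 + p2)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p1) + shannon_entropy(p2))

def jensen_tsallis_q_difference(p1, p2, q, w=(0.5, 0.5)):
    """Tsallis q-entropy of the mixture minus the w_i**q weighted entropies;
    recovers the JSD when q = 1 and w = (1/2, 1/2) (assumed standard form)."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    m = w[0] * p1 + w[1] * p2
    return tsallis_entropy(m, q) - (w[0] ** q * tsallis_entropy(p1, q)
                                    + w[1] ** q * tsallis_entropy(p2, q))

# Illustrative distributions (assumptions, not from the paper).
p1 = np.array([0.9, 0.1])
p2 = np.array([0.2, 0.8])
print(jensen_shannon_divergence(p1, p2))           # symmetric and non-negative
print(jensen_tsallis_q_difference(p1, p2, q=1.0))  # matches the JSD at q = 1
print(jensen_tsallis_q_difference(p1, p2, q=2.0))  # a nonextensive (q != 1) case
```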

Authors (3)
  1. André Martins (10 papers)
  2. Pedro Aguiar (2 papers)
  3. Mário Figueiredo (9 papers)
Citations (4)
