Two tales for a geometric Jensen--Shannon divergence (2508.05066v1)
Abstract: The geometric Jensen--Shannon divergence (G-JSD) gained popularity in machine learning and information sciences thanks to its closed-form expression between Gaussian distributions. In this work, we introduce an alternative definition of the geometric Jensen--Shannon divergence, tailored to positive densities, which does not normalize the geometric mixture. This novel divergence is termed the extended G-JSD, as it extends to more general positive measures. We make explicit the gap between the extended G-JSD and the G-JSD when considering probability densities, and report both lower and upper bounds in terms of other statistical divergences. We derive the corresponding closed-form expressions for the case of multivariate Gaussian distributions often encountered in applications. Finally, we show that these two types of geometric JSDs, the G-JSD and the extended G-JSD, can be interpreted as regularizations of the ordinary JSD by additive terms.
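The closed form for Gaussians mentioned in the abstract follows from the fact that the normalized geometric mixture of two Gaussians is again a Gaussian, whose precision matrix is the weighted mean of the two precision matrices. The sketch below is not from the paper: it is a minimal illustration assuming the common skewed convention JS^G_alpha(p:q) = (1-alpha) KL(p:g_alpha) + alpha KL(q:g_alpha), with g_alpha the normalized geometric mixture and alpha = 1/2 recovering the standard G-JSD; all function names are illustrative.

```python
# Minimal sketch (assumed convention, not the paper's code): closed-form G-JSD
# between two multivariate Gaussians via the normalized geometric mixture.
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """KL(N(mu0, cov0) : N(mu1, cov1)) in closed form."""
    d = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def geometric_mixture(mu0, cov0, mu1, cov1, alpha=0.5):
    """Normalized geometric mixture p^(1-alpha) q^alpha / Z of two Gaussians (again a Gaussian)."""
    p0, p1 = np.linalg.inv(cov0), np.linalg.inv(cov1)     # precision matrices
    cov_a = np.linalg.inv((1 - alpha) * p0 + alpha * p1)   # interpolated precision, inverted
    mu_a = cov_a @ ((1 - alpha) * p0 @ mu0 + alpha * p1 @ mu1)
    return mu_a, cov_a

def g_jsd(mu0, cov0, mu1, cov1, alpha=0.5):
    """Skewed geometric Jensen-Shannon divergence between two Gaussians."""
    mu_a, cov_a = geometric_mixture(mu0, cov0, mu1, cov1, alpha)
    return ((1 - alpha) * kl_gaussian(mu0, cov0, mu_a, cov_a)
            + alpha * kl_gaussian(mu1, cov1, mu_a, cov_a))

# Example: two 2-D Gaussians
mu0, cov0 = np.zeros(2), np.eye(2)
mu1, cov1 = np.ones(2), 2.0 * np.eye(2)
print(g_jsd(mu0, cov0, mu1, cov1))
```

The extended G-JSD discussed in the paper instead uses the unnormalized geometric mixture, so it adds a term involving the normalizer (the gap the abstract refers to); the snippet above only covers the normalized (standard) G-JSD.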