
A note on the quasiconvex Jensen divergences and the quasiconvex Bregman divergences derived thereof (1909.08857v2)

Published 19 Sep 2019 in cs.IT, cs.LG, and math.IT

Abstract: We first introduce the class of strictly quasiconvex and strictly quasiconcave Jensen divergences which are oriented (asymmetric) distances, and study some of their properties. We then define the strictly quasiconvex Bregman divergences as the limit case of scaled and skewed quasiconvex Jensen divergences, and report a simple closed-form formula which shows that these divergences are only pseudo-divergences at countably many inflection points of the generators. To remedy this problem, we propose the $\delta$-averaged quasiconvex Bregman divergences which integrate the pseudo-divergences over a small neighborhood in order to obtain a proper divergence. The formula of $\delta$-averaged quasiconvex Bregman divergences extends even to non-differentiable strictly quasiconvex generators. These quasiconvex Bregman divergences between distinct elements have the property that one orientation is always finite while the other orientation is infinite. We show that these quasiconvex Bregman divergences can also be interpreted as limit cases of generalized skewed Jensen divergences with respect to comparative convexity by using power means. Finally, we illustrate how these quasiconvex Bregman divergences naturally appear as equivalent divergences for the Kullback-Leibler divergences between probability densities belonging to the same parametric family of distributions with nested supports.

Authors (2)
  1. Frank Nielsen (125 papers)
  2. Gaëtan Hadjeres (24 papers)

Summary

Overview of the Paper: Quasiconvex Jensen and Bregman Divergences

The paper "A note on the quasiconvex Jensen divergences and the quasiconvex Bregman divergences derived thereof," authored by Frank Nielsen and Gaëtan Hadjeres, introduces the notion of divergences based on quasiconvex functions. This work offers a theoretical examination of these divergences, extending the classical framework through profound mathematical insights.

Introduction to Quasiconvex Divergences

The authors begin by defining the class of strictly quasiconvex and strictly quasiconcave Jensen divergences. A quasiconvex function relaxes convexity: it requires only that every sublevel set be convex, or equivalently that $F((1-\lambda)x + \lambda y) \leq \max\{F(x), F(y)\}$ for all $\lambda \in [0,1]$; strict quasiconvexity asks for strict inequality at distinct points. The divergence between two objects is built from a strictly quasiconvex generator by comparing the larger of the two generator values against the generator value at an intermediate point. Skewing that intermediate point away from the midpoint makes the divergences oriented (asymmetric) distances, and the paper studies the properties these asymmetries induce.
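
As a concrete illustration, here is a minimal numerical sketch in Python. It assumes the skewed quasiconvex Jensen divergence takes the form $\max\{F(p), F(q)\} - F((1-\alpha)p + \alpha q)$, consistent with the abstract's description of scaled and skewed quasiconvex Jensen divergences; the function names and the example generator are ours.

```python
import numpy as np

def qc_jensen(F, p, q, alpha=0.5):
    """Skewed quasiconvex Jensen divergence (assumed form):
    max(F(p), F(q)) - F((1 - alpha) * p + alpha * q).
    Non-negative whenever F is strictly quasiconvex and alpha is in (0, 1)."""
    return max(F(p), F(q)) - F((1 - alpha) * p + alpha * q)

# F(x) = sqrt(|x|) is strictly quasiconvex on the reals but not convex.
F = lambda x: np.sqrt(abs(x))

print(qc_jensen(F, 1.0, 4.0))             # alpha = 1/2: symmetric midpoint case
print(qc_jensen(F, 1.0, 4.0, alpha=0.1))  # skewing makes it asymmetric...
print(qc_jensen(F, 4.0, 1.0, alpha=0.1))  # ...these two values differ
```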

Quasiconvex Bregman Divergences

Building on the quasiconvex Jensen divergences, the paper defines quasiconvex Bregman divergences as limit cases of scaled and skewed quasiconvex Jensen divergences. A key result is a simple closed-form formula for this limit, which also exposes a defect: the limit is only a pseudo-divergence, since it can vanish between distinct elements at the countably many inflection points of the generator.
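
To make the closed form concrete, the sketch below takes the limit $\lim_{\alpha \to 0^+} \frac{1}{\alpha}\left(\max\{F(p),F(q)\} - F((1-\alpha)p + \alpha q)\right)$ under the skewed form assumed above, which yields $\langle \nabla F(p), p-q \rangle$ when $F(q) \leq F(p)$ and $+\infty$ otherwise. The orientation convention here is our assumption, but the one-orientation-finite, one-orientation-infinite behavior matches the property stated in the abstract.

```python
import numpy as np

def qc_bregman(F, grad_F, p, q):
    """Closed-form limit of (1/alpha) * skewed quasiconvex Jensen divergence
    as alpha -> 0+ (sketch; orientation convention assumed):
      <grad F(p), p - q>  if F(q) <= F(p),
      +inf                otherwise.
    It vanishes between distinct points wherever grad F(p) = 0, i.e. at the
    generator's inflection points -- the pseudo-divergence defect."""
    if F(q) <= F(p):
        return np.dot(grad_F(p), np.subtract(p, q))
    return np.inf

# F(x) = x**3 is strictly increasing, hence strictly quasiconvex, with an
# inflection point at x = 0 where F'(0) = 0.
F = lambda x: x**3
grad_F = lambda x: 3.0 * x**2

print(qc_bregman(F, grad_F, 2.0, 1.0))   # finite orientation: 12.0
print(qc_bregman(F, grad_F, 1.0, 2.0))   # infinite orientation: inf
print(qc_bregman(F, grad_F, 0.0, -1.0))  # 0.0 although p != q: pseudo-divergence
```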

To remedy this, the authors introduce the $\delta$-averaged quasiconvex Bregman divergences, which recover a proper divergence by averaging the pseudo-divergence over a small neighborhood. Notably, the resulting formula extends even to strictly quasiconvex generators that are not differentiable.
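
One way to realize this averaging (a sketch under our own assumptions; the paper's exact definition may differ) is to average the directional derivative along the segment from $p$ toward $q$ over the parameter range $[0, \delta]$. By the fundamental theorem of calculus the average collapses to a finite difference, $\frac{1}{\delta}\left(F(p) - F((1-\delta)p + \delta q)\right)$, which requires no derivative of $F$ and is strictly positive between distinct points by strict quasiconvexity:

```python
import numpy as np

def qc_bregman_delta(F, p, q, delta=1e-3):
    """delta-averaged quasiconvex Bregman divergence (sketch under our
    finite-difference assumption):
      (1/delta) * (F(p) - F((1 - delta)*p + delta*q))  if F(q) <= F(p),
      +inf                                             otherwise.
    No gradient of F is needed, and strict quasiconvexity makes the value
    strictly positive for p != q, curing the pseudo-divergence defect."""
    if F(q) <= F(p):
        return (F(p) - F((1 - delta) * p + delta * q)) / delta
    return np.inf

F = lambda x: x**3
print(qc_bregman_delta(F, 0.0, -1.0))  # ~1e-6 > 0: no longer degenerate
```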

Case Studies and Relationships

The paper then places these constructions within the framework of comparative convexity. Generalized skewed Jensen divergences can be defined with respect to a pair of abstract means, and since the power mean $M_r(a,b) = \left(\frac{a^r + b^r}{2}\right)^{1/r}$ tends to $\max\{a,b\}$ as $r \to \infty$, the quasiconvex Bregman divergences arise naturally as limit cases of this generalized family. Finally, the authors show that the Kullback-Leibler divergence between probability densities belonging to the same parametric family with nested supports is equivalent to a quasiconvex Bregman divergence, with implications for statistical tests and information geometry.
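
The nested-support phenomenon is easy to see on the uniform family $\{U[0,\theta]\}_{\theta > 0}$ (our illustrative choice of family, not necessarily the paper's worked example): the Kullback-Leibler divergence is finite exactly when the first support is contained in the second, mirroring the one-orientation-finite behavior of quasiconvex Bregman divergences.

```python
import numpy as np

def kl_uniform(theta1, theta2):
    """KL(U[0, theta1] : U[0, theta2]) for uniform densities:
    log(theta2 / theta1) when theta1 <= theta2 (nested supports),
    +inf otherwise -- finite in exactly one orientation."""
    return np.log(theta2 / theta1) if theta1 <= theta2 else np.inf

print(kl_uniform(1.0, 2.0))  # log 2 ~ 0.693: support [0,1] nested in [0,2]
print(kl_uniform(2.0, 1.0))  # inf: the reverse orientation diverges
```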

Implications and Future Work

The introduction and formal definition of quasiconvex divergences point to potential applications in clustering and classification, where asymmetric measures are advantageous. The paper suggests that these divergences may improve machine learning algorithms, particularly in modeling tasks where non-linear and asymmetric data distributions are prevalent.

Further research may explore the integration of these divergences in large-scale data analytics and optimization problems, potentially enhancing the performance of algorithms such as $k$-means clustering or hierarchical clustering.

In conclusion, this paper provides a comprehensive analysis and formalization of quasiconvex divergences, paving the way for their application in various domains of artificial intelligence and statistical inference.
