Consistency issues in Gaussian Mixture Models reduction algorithms (2104.12586v1)

Published 26 Apr 2021 in stat.ML, cs.IT, cs.LG, math.IT, and math.OC

Abstract: In many contexts Gaussian Mixtures (GM) are used to approximate probability distributions, possibly time-varying. In some applications the number of GM components grows exponentially over time, and reduction procedures are required to keep it reasonably limited. The GM reduction (GMR) problem can be formulated by choosing a measure of the dissimilarity between the GMs before and after reduction, such as the Kullback-Leibler Divergence (KLD) or the Integral Squared Error (ISE). Since in no case can the solution be obtained in closed form, many approximate GMR algorithms have been proposed over the past three decades, although none of them provides optimality guarantees. In this work we discuss the importance of the choice of the dissimilarity measure and the issue of consistency of all steps of a reduction algorithm with the chosen measure. Indeed, most existing GMR algorithms are composed of several steps that are not consistent with a single measure, and for this reason they may produce reduced GMs far from optimal. In particular, the use of the KLD, the ISE, and the normalized ISE is discussed and compared from this perspective.
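
As a side note not drawn from the paper itself: while the optimal reduced GM is not available in closed form, the ISE between two Gaussian mixtures is, thanks to the standard Gaussian product integral:

$$
\mathrm{ISE}(p, q) = \int \big(p(x) - q(x)\big)^2 \, dx,
\qquad
\int \mathcal{N}(x;\mu_i,\Sigma_i)\,\mathcal{N}(x;\mu_j,\Sigma_j)\,dx = \mathcal{N}(\mu_i;\mu_j,\Sigma_i+\Sigma_j),
$$

so the ISE expands into three double sums of Gaussian evaluations. A minimal sketch of this computation for 1-D mixtures, assuming SciPy is available (function and variable names here are illustrative, not the paper's):

```python
import numpy as np
from scipy.stats import norm

def gauss_cross(m1, v1, m2, v2):
    # Closed-form Gaussian product integral:
    # integral of N(x; m1, v1) * N(x; m2, v2) dx = N(m1; m2, v1 + v2)
    return norm.pdf(m1, loc=m2, scale=np.sqrt(v1 + v2))

def ise(wp, mp, vp, wq, mq, vq):
    """Integral Squared Error between 1-D Gaussian mixtures
    p = sum_i wp[i] N(mp[i], vp[i]) and q defined likewise.
    (Illustrative sketch, not taken from the paper.)"""
    def cross(wa, ma, va, wb, mb, vb):
        return sum(wa[i] * wb[j] * gauss_cross(ma[i], va[i], mb[j], vb[j])
                   for i in range(len(wa)) for j in range(len(wb)))
    # ISE(p, q) = <p, p> - 2 <p, q> + <q, q>
    return (cross(wp, mp, vp, wp, mp, vp)
            - 2 * cross(wp, mp, vp, wq, mq, vq)
            + cross(wq, mq, vq, wq, mq, vq))

# Example: compare a 3-component mixture with its single moment-matched Gaussian
wp, mp, vp = [0.5, 0.3, 0.2], [0.0, 1.0, 4.0], [1.0, 0.5, 2.0]
m = sum(w * mu for w, mu in zip(wp, mp))                            # mixture mean
v = sum(w * (s + (mu - m) ** 2) for w, mu, s in zip(wp, mp, vp))    # mixture variance
print(ise(wp, mp, vp, [1.0], [m], [v]))
```

Note that the single-Gaussian merge in the example is moment-matched, which minimizes the KLD from the original mixture rather than the ISE; pairing such a merge step with an ISE-based selection criterion is precisely the kind of measure inconsistency the paper examines.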

Citations (6)
