Average entropy of Gaussian mixtures (2404.07311v2)
Published 10 Apr 2024 in cs.IT and math.IT
Abstract: We calculate the average differential entropy of a $q$-component Gaussian mixture in $\mathbb{R}^n$. For simplicity, all components have covariance matrix $\sigma^2 \mathbf{1}$, while the means $\{\mathbf{W}_i\}_{i=1}^{q}$ are i.i.d. Gaussian vectors with zero mean and covariance $s^2 \mathbf{1}$. We obtain a series expansion in $\mu = s^2/\sigma^2$ for the average differential entropy up to order $\mathcal{O}(\mu^2)$, and we provide a recipe to calculate higher-order terms. Our result provides an analytic approximation with a quantifiable order of magnitude for the error, which is not achieved in previous literature.
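The expansion coefficients themselves are derived in the paper; as a purely illustrative cross-check of the setup described in the abstract, the sketch below (Python, with hypothetical choices of $q$, $n$, $s$, $\sigma$ and Monte Carlo sample sizes) estimates the average differential entropy numerically: it draws mean configurations $\mathbf{W}_i \sim \mathcal{N}(0, s^2\mathbf{1})$, samples points from the resulting equal-weight mixture, and uses $h(p) \approx -\mathbb{E}_{x\sim p}[\log p(x)]$. This is not the paper's series expansion, only a brute-force estimator one could compare it against in the small-$\mu$ regime.

```python
import numpy as np

# Monte Carlo sketch (not the paper's expansion): estimate the average
# differential entropy of a q-component Gaussian mixture in R^n whose means
# W_i ~ N(0, s^2 I) are i.i.d. and whose common covariance is sigma^2 I.

rng = np.random.default_rng(0)

def mixture_logpdf(x, means, sigma):
    """Log density of an equal-weight Gaussian mixture with covariance sigma^2 I.

    x: (m, n) evaluation points; means: (q, n) component means.
    """
    n = x.shape[1]
    d2 = ((x[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)      # (m, q)
    log_comp = -d2 / (2 * sigma**2) - 0.5 * n * np.log(2 * np.pi * sigma**2)
    return np.logaddexp.reduce(log_comp, axis=1) - np.log(means.shape[0])

def average_mixture_entropy(q, n, s, sigma, n_means=200, n_samples=2000):
    """Average over random mean configurations of h(p) ~ -E_{x~p}[log p(x)]."""
    entropies = []
    for _ in range(n_means):
        means = rng.normal(0.0, s, size=(q, n))          # W_i ~ N(0, s^2 I)
        comp = rng.integers(q, size=n_samples)           # uniform component choice
        x = means[comp] + rng.normal(0.0, sigma, size=(n_samples, n))
        entropies.append(-mixture_logpdf(x, means, sigma).mean())
    return float(np.mean(entropies))

if __name__ == "__main__":
    # Small mu = s^2 / sigma^2, where a low-order expansion should be accurate;
    # at mu = 0 the mixture reduces to a single Gaussian with entropy
    # (n/2) * log(2 * pi * e * sigma^2).
    print(average_mixture_entropy(q=4, n=2, s=0.3, sigma=1.0))
```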