Regularized EM Algorithm (2307.01955v1)
Abstract: The Expectation-Maximization (EM) algorithm is a widely used iterative method for computing the maximum likelihood estimate of a Gaussian Mixture Model (GMM). When the sample size is smaller than the data dimension, the covariance matrix estimates can become singular or poorly conditioned, which degrades performance. This paper presents a regularized version of the EM algorithm that efficiently uses prior knowledge to cope with a small sample size. The method maximizes a penalized GMM likelihood, where the regularized estimation ensures positive definiteness of the covariance matrix updates by shrinking the estimators towards structured target covariance matrices. Finally, experiments on real data highlight the good performance of the proposed algorithm for clustering purposes.
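The abstract does not spell out the penalty or the exact update equations. As a rough illustration of the shrinkage idea it describes, the Python sketch below shows a responsibility-weighted M-step covariance update shrunk towards a scaled-identity target so the estimate stays positive definite when the sample size is small. The function name, the shrinkage weight `rho`, and the default target are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def regularized_covariance_update(X, resp_k, mu_k, rho=0.3, target=None):
    """Shrinkage-regularized M-step covariance update for one GMM component.

    Hypothetical sketch: the responsibility-weighted empirical covariance is
    shrunk towards a structured target (a scaled identity by default) so the
    update remains positive definite even when n < p. The paper's penalty and
    shrinkage rule may differ; `rho` is an illustrative shrinkage weight.
    """
    n, p = X.shape
    nk = resp_k.sum()                              # effective sample size of component k
    diff = X - mu_k                                # center data around the component mean
    S_k = (resp_k[:, None] * diff).T @ diff / nk   # weighted empirical covariance
    if target is None:
        target = (np.trace(S_k) / p) * np.eye(p)   # scaled-identity target
    # Convex combination with the target restores positive definiteness.
    return (1.0 - rho) * S_k + rho * target
```

In a full EM loop, this update would replace the standard M-step covariance estimate for each component, with `rho` chosen by cross-validation or set from prior knowledge about the target structure.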