
High-dimensional unsupervised classification via parsimonious contaminated mixtures (1408.2128v4)

Published 9 Aug 2014 in stat.ME, stat.AP, and stat.CO

Abstract: The contaminated Gaussian distribution represents a simple heavy-tailed elliptical generalization of the Gaussian distribution; unlike the often-considered t-distribution, it also allows for automatic detection of mild outlying or "bad" points in the same way that observations are typically assigned to the groups in the finite mixture model context. Starting from this distribution, we propose the contaminated factor analysis model as a method for dimensionality reduction and detection of bad points in higher dimensions. A mixture of contaminated Gaussian factor analyzers (MCGFA) model follows therefrom, and extends the recently proposed mixture of contaminated Gaussian distributions to high-dimensional data. We introduce a family of 32 parsimonious models formed by introducing constraints on the covariance and contamination structures of the general MCGFA model. We outline a variant of the expectation-maximization algorithm for parameter estimation. Various implementation issues are discussed, and the novel family of models is compared to well-established approaches on both simulated and real data.
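For readers scanning the abstract, a brief sketch of the density forms behind the terminology may help. The notation below (φ for the multivariate Gaussian density; π_g, α_g, η_g, Λ_g, Ψ_g for the mixing, contamination, inflation, factor-loading, and noise parameters) follows common convention in this literature and is illustrative rather than quoted from the paper. The contaminated Gaussian distribution is a two-component scale mixture,

\[
f(\mathbf{x};\,\boldsymbol{\mu},\boldsymbol{\Sigma},\alpha,\eta)
  = \alpha\,\phi(\mathbf{x};\,\boldsymbol{\mu},\boldsymbol{\Sigma})
  + (1-\alpha)\,\phi(\mathbf{x};\,\boldsymbol{\mu},\eta\boldsymbol{\Sigma}),
\qquad \eta > 1,
\]

where α is the proportion of "good" points (typically constrained to exceed 1/2 so that good points form the majority) and η inflates the covariance for "bad" points. A mixture of G such components, each with the factor-analytic covariance decomposition assumed below, gives a density of the form

\[
p(\mathbf{x}) = \sum_{g=1}^{G} \pi_g\Big[
  \alpha_g\,\phi\big(\mathbf{x};\,\boldsymbol{\mu}_g,\boldsymbol{\Lambda}_g\boldsymbol{\Lambda}_g^{\top}+\boldsymbol{\Psi}_g\big)
  + (1-\alpha_g)\,\phi\big(\mathbf{x};\,\boldsymbol{\mu}_g,\eta_g(\boldsymbol{\Lambda}_g\boldsymbol{\Lambda}_g^{\top}+\boldsymbol{\Psi}_g)\big)\Big].
\]

The decomposition \(\boldsymbol{\Sigma}_g = \boldsymbol{\Lambda}_g\boldsymbol{\Lambda}_g^{\top}+\boldsymbol{\Psi}_g\), with \(\boldsymbol{\Lambda}_g\) a p × q loading matrix and q much smaller than p, is what keeps the number of covariance parameters manageable in high dimensions; the weights α_g and inflation factors η_g are what allow each observation to be flagged a posteriori as "good" or "bad", analogously to how observations are assigned to mixture components.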
