Bayesian mixture modeling using a mixture of finite mixtures with normalized inverse Gaussian weights (2501.18854v1)

Published 31 Jan 2025 in stat.ME

Abstract: In Bayesian inference for mixture models with an unknown number of components, a finite mixture model is usually employed that assumes prior distributions for the mixing weights and the number of components. This model is called a mixture of finite mixtures (MFM). As a prior distribution for the weights, a (symmetric) Dirichlet distribution is widely used for its conjugacy and computational simplicity, but the choice of its concentration parameter influences the estimate of the number of components. In this paper, we focus on estimating the number of components. As a robust alternative to Dirichlet weights, we present a method based on a mixture of finite mixtures with normalized inverse Gaussian weights. The motivation is similar to the use of normalized inverse Gaussian processes instead of Dirichlet processes for infinite mixture modeling. By introducing latent variables, posterior computation is carried out using block Gibbs sampling without resorting to the reversible jump algorithm. The performance of the proposed method is illustrated through numerical experiments and real data examples, including clustering, density estimation, and community detection.
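The construction behind the normalized inverse Gaussian prior parallels the familiar Dirichlet one: just as symmetric Dirichlet weights can be obtained by normalizing i.i.d. Gamma draws, normalized inverse Gaussian weights arise by normalizing i.i.d. inverse Gaussian (Wald) draws. The sketch below illustrates that parallel only; the parameterization (unit mean, common shape parameter a) is an illustrative assumption and not necessarily the prior specification used in the paper.

```python
import numpy as np

def dirichlet_weights(k, alpha, rng):
    # Symmetric Dirichlet(alpha, ..., alpha) weights via the
    # normalized-Gamma construction.
    g = rng.gamma(shape=alpha, scale=1.0, size=k)
    return g / g.sum()

def nig_weights(k, a, rng):
    # Normalized inverse Gaussian weights: normalize k i.i.d.
    # inverse Gaussian (Wald) draws. The (mean=1, scale=a)
    # parameterization here is an assumption for illustration;
    # see the paper for the exact prior used.
    w = rng.wald(mean=1.0, scale=a, size=k)
    return w / w.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    k = 5
    print("Dirichlet weights:    ", dirichlet_weights(k, alpha=1.0, rng=rng))
    print("Normalized-IG weights:", nig_weights(k, a=1.0, rng=rng))
```

Both functions return a point on the (k-1)-simplex; the difference between the two priors lies in the tail behavior of the normalizing draws, which is what drives the robustness of the component-number estimate discussed in the abstract.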
