
Flat-topped Probability Density Functions for Mixture Models (2203.17027v1)

Published 31 Mar 2022 in cs.LG, math.PR, math.ST, stat.ML, and stat.TH

Abstract: This paper investigates probability density functions (PDFs) that are continuous everywhere, nearly uniform around the mode of the distribution, and adaptable to a variety of distribution shapes ranging from bell-shaped to rectangular. From the viewpoint of computational tractability, the PDF based on the Fermi-Dirac or logistic function is advantageous in estimating its shape parameters. The most appropriate PDF for an $n$-variate distribution is of the form $p\left(\mathbf{x}\right)\propto\left[\cosh\left(\left[\left(\mathbf{x}-\mathbf{m}\right)^{\mathsf{T}}\boldsymbol{\Sigma}^{-1}\left(\mathbf{x}-\mathbf{m}\right)\right]^{n/2}\right)+\cosh\left(r^{n}\right)\right]^{-1}$, where $\mathbf{x},\mathbf{m}\in\mathbb{R}^{n}$, $\boldsymbol{\Sigma}$ is an $n\times n$ positive definite matrix, and $r>0$ is a shape parameter. The flat-topped PDFs can be used as components of mixture models in machine learning to improve goodness of fit while keeping the model as simple as possible.
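The density in the abstract is straightforward to evaluate numerically. Below is a minimal NumPy sketch of the unnormalized form, where the quadratic term is the Mahalanobis distance $(\mathbf{x}-\mathbf{m})^{\mathsf{T}}\boldsymbol{\Sigma}^{-1}(\mathbf{x}-\mathbf{m})$; the function name and example parameters are illustrative, not from the paper.

```python
import numpy as np

def flat_top_pdf_unnorm(x, m, Sigma, r):
    """Unnormalized flat-topped density from the abstract:
    p(x) ∝ 1 / (cosh(q^{n/2}) + cosh(r^n)),
    where q = (x - m)^T Sigma^{-1} (x - m) and n = dim(x).
    """
    n = len(m)
    d = np.asarray(x, dtype=float) - np.asarray(m, dtype=float)
    q = d @ np.linalg.solve(Sigma, d)  # Mahalanobis quadratic form
    return 1.0 / (np.cosh(q ** (n / 2)) + np.cosh(r ** n))

# Illustrative 2-D example: with a moderately large r the density is
# nearly constant near the mode m (the "flat top") and decays beyond it.
m = np.zeros(2)
Sigma = np.eye(2)
p_mode = flat_top_pdf_unnorm(np.zeros(2), m, Sigma, r=2.0)          # at the mode
p_near = flat_top_pdf_unnorm(np.array([1.0, 0.0]), m, Sigma, r=2.0)  # inside plateau
p_far = flat_top_pdf_unnorm(np.array([3.0, 0.0]), m, Sigma, r=2.0)   # outside plateau
```

Because $\cosh(r^n)$ dominates the denominator whenever $q^{n/2}\ll r^n$, the density stays almost constant inside the radius set by $r$, which is what makes these PDFs nearly uniform around the mode.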
