Multi-Bernoulli Mixture Filter: Complete Derivation and Sequential Monte Carlo Implementation (1911.03699v1)

Published 9 Nov 2019 in eess.SP

Abstract: The multi-Bernoulli mixture (MBM) filter is one of the exact closed-form multi-target Bayes filters in the random finite set (RFS) framework, and it uses a multi-Bernoulli mixture density as the multi-target conjugate prior. The filter is the variant of the Poisson multi-Bernoulli mixture filter obtained when the birth process is changed from a Poisson RFS to a multi-Bernoulli RFS or a multi-Bernoulli mixture RFS. Conversely, the labeled multi-Bernoulli mixture filter reduces to the MBM filter when the labels are discarded. In this letter, we provide a complete derivation of the MBM filter in which the update step is derived without using the probability generating functional. We also describe a sequential Monte Carlo implementation and adopt Gibbs sampling to truncate the MBM filtering density. A numerical simulation with a nonlinear measurement model shows that the MBM filter outperforms the classical probability hypothesis density filter.
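
For context, the following is a minimal sketch of the Bernoulli and multi-Bernoulli mixture density forms referred to in the abstract, written in the standard RFS notation of the PMBM literature; the symbols (r, p, w_j, n_j, f_{j,i}) are generic placeholders and are not taken from this paper.

% Sketch in standard RFS notation; symbols are illustrative assumptions.
% Bernoulli density with existence probability r and spatial density p:
f(X) =
\begin{cases}
  1 - r      & \text{if } X = \emptyset,\\
  r\, p(x)   & \text{if } X = \{x\},\\
  0          & \text{otherwise}.
\end{cases}
% MBM density: a weighted mixture over global hypotheses j, each being a
% multi-Bernoulli density formed by the disjoint union of n_j Bernoulli sets:
f(X) \;=\; \sum_{j} w_j \sum_{X_1 \uplus \cdots \uplus X_{n_j} = X}\; \prod_{i=1}^{n_j} f_{j,i}(X_i),
\qquad \sum_j w_j = 1.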
