On Gibbs Sampling Architecture for Labeled Random Finite Sets Multi-Object Tracking (2306.15135v1)

Published 27 Jun 2023 in eess.SY, cs.SY, and eess.SP

Abstract: Gibbs sampling is one of the most popular Markov chain Monte Carlo algorithms because of its simplicity, scalability, and wide applicability within many fields of statistics, science, and engineering. In the labeled random finite sets literature, Gibbs sampling procedures have recently been applied to efficiently truncate the single-sensor and multi-sensor $\delta$-generalized labeled multi-Bernoulli posterior density as well as the multi-sensor adaptive labeled multi-Bernoulli birth distribution. However, only a limited discussion has been provided regarding key Gibbs sampler architecture details including the Markov chain Monte Carlo sample generation technique and early termination criteria. This paper begins with a brief background on Markov chain Monte Carlo methods and a review of the Gibbs sampler implementations proposed for labeled random finite sets filters. Next, we propose a short chain, multi-simulation sample generation technique that is well suited for these applications and enables a parallel processing implementation. Additionally, we present two heuristic early termination criteria that achieve similar sampling performance with substantially fewer Markov chain observations. Finally, the benefits of the proposed Gibbs samplers are demonstrated via two Monte Carlo simulations.
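To make the sampling structure described in the abstract concrete, below is a minimal, hypothetical Python sketch of a short-chain, multi-simulation Gibbs sampler with a stall-based early termination heuristic, in the spirit of truncating an assignment-type posterior. The weight matrix `eta`, the missed-detection convention in column 0, the function name `gibbs_assignment_sampler`, and the stall-count stopping rule are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np


def gibbs_assignment_sampler(eta, num_chains=8, chain_length=20,
                             stall_limit=5, seed=None):
    """Short-chain, multi-simulation Gibbs sampler (illustrative sketch).

    eta : (n_obj, n_opt) array of unnormalized assignment weights.
          Column 0 is treated as a "missed detection" option that any
          number of objects may share; columns 1..n_opt-1 are exclusive.
          This layout is an assumption made for the sketch.
    Returns the set of unique assignment vectors visited by all chains.
    """
    rng = np.random.default_rng(seed)
    n_obj, n_opt = eta.shape
    solutions = set()

    for _ in range(num_chains):                  # independent short chains
        gamma = np.zeros(n_obj, dtype=int)       # start from all-missed state
        stall = 0
        for _ in range(chain_length):
            for i in range(n_obj):               # one full Gibbs sweep
                weights = eta[i].astype(float)
                taken = np.delete(gamma, i)      # options held by other objects
                weights[taken[taken > 0]] = 0.0  # enforce one object per option
                weights /= weights.sum()         # assumes eta[i, 0] > 0
                gamma[i] = rng.choice(n_opt, p=weights)
            key = tuple(gamma)
            if key in solutions:
                stall += 1                       # no new distinct sample found
            else:
                solutions.add(key)
                stall = 0
            if stall >= stall_limit:             # heuristic early termination
                break
    return solutions
```

In this sketch, the distinct assignment vectors visited by all chains would be the retained hypotheses, to be ranked afterwards by their weights. Because each short chain is independent, the multi-simulation structure lends itself naturally to a parallel implementation, with one chain per worker.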

Citations (1)
