Particle Filtering and Smoothing Using Windowed Rejection Sampling (1407.4414v1)
Abstract: "Particle methods" are sequential Monte Carlo algorithms, typically involving importance sampling, that are used to estimate and sample from joint and marginal densities of a (presumably increasing) collection of random variables. In particular, a particle filter aims to estimate the current state $X_{n}$ of a stochastic system that is not directly observable by estimating a posterior distribution $\pi(x_{n}|y_{1},y_{2}, \ldots, y_{n})$, where the $\{Y_{n}\}$ are observations related to the $\{X_{n}\}$ through some measurement model $\pi(y_{n}|x_{n})$. A particle smoother aims to estimate a marginal distribution $\pi(x_{i}|y_{1},y_{2}, \ldots, y_{n})$ for $1 \leq i < n$. Particle methods are used extensively for hidden Markov models, where $\{X_{n}\}$ is a Markov chain, as well as for more general state space models. Existing particle filtering algorithms are extremely fast and easy to implement. Although they suffer from issues of degeneracy and "sample impoverishment", steps can be taken to minimize these problems, and overall they are excellent tools for inference. However, if one wishes to sample from a posterior distribution of interest, a particle filter is only able to produce dependent draws. Particle smoothing algorithms are complicated and far less robust, often requiring cumbersome post-processing, "forward-backward" recursions, and multiple passes through subroutines. In this paper we introduce an alternative algorithm for both filtering and smoothing that is based on rejection sampling "in windows". We compare both the speed and accuracy of the traditional particle filter and this "windowed rejection sampler" (WRS) for several examples and show that good estimates for smoothing distributions are obtained at no extra cost.
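For context, the sketch below shows the traditional (bootstrap) particle filter that the abstract compares against, not the paper's windowed rejection sampler. It is a minimal illustration assuming a simple linear-Gaussian state space model; the function name, model parameters, and particle count are illustrative assumptions rather than details from the paper.

```python
import numpy as np

def bootstrap_particle_filter(y, n_particles=1000,
                              phi=0.9, sigma_x=1.0, sigma_y=1.0,
                              rng=None):
    """Estimate E[X_t | y_1..y_t] for the assumed model
    X_t = phi * X_{t-1} + N(0, sigma_x^2),  Y_t = X_t + N(0, sigma_y^2)."""
    rng = np.random.default_rng() if rng is None else rng
    T = len(y)
    # Initialize particles from the stationary distribution of the AR(1) chain.
    x = rng.normal(0.0, sigma_x / np.sqrt(1.0 - phi**2), size=n_particles)
    filtered_means = np.empty(T)
    for t in range(T):
        # Propagate particles through the state transition pi(x_t | x_{t-1}).
        x = phi * x + rng.normal(0.0, sigma_x, size=n_particles)
        # Importance weights from the measurement model pi(y_t | x_t).
        log_w = -0.5 * ((y[t] - x) / sigma_y) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        filtered_means[t] = np.sum(w * x)
        # Multinomial resampling; repeated indices are the source of the
        # "sample impoverishment" mentioned in the abstract.
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
    return filtered_means

# Example usage: simulate data from the same assumed model and filter it.
rng = np.random.default_rng(0)
x_true = np.zeros(100)
for t in range(1, 100):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal()
y = x_true + rng.normal(size=100)
print(bootstrap_particle_filter(y, rng=rng)[:5])
```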