
Can local particle filters beat the curse of dimensionality? (1301.6585v2)

Published 28 Jan 2013 in math.ST, math.PR, and stat.TH

Abstract: The discovery of particle filtering methods has enabled the use of nonlinear filtering in a wide array of applications. Unfortunately, the approximation error of particle filters typically grows exponentially in the dimension of the underlying model. This phenomenon has rendered particle filters of limited use in complex data assimilation problems. In this paper, we argue that it is often possible, at least in principle, to develop local particle filtering algorithms whose approximation error is dimension-free. The key to such developments is the decay of correlations property, which is a spatial counterpart of the much better understood stability property of nonlinear filters. For the simplest possible algorithm of this type, our results provide under suitable assumptions an approximation error bound that is uniform both in time and in the model dimension. More broadly, our results provide a framework for the investigation of filtering problems and algorithms in high dimension.
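
The "simplest possible algorithm of this type" mentioned in the abstract is a localized particle filter that resamples small spatial blocks independently rather than the full high-dimensional state. The sketch below is a minimal illustration of that block-wise idea on a toy linear-Gaussian lattice model; the specific model, the block partition, and every function and parameter name are assumptions made for illustration, not the paper's exact algorithm or notation.

```python
import numpy as np

rng = np.random.default_rng(0)

def block_particle_filter(ys, d, n_particles, block_size,
                          a=0.9, sigma_x=1.0, sigma_y=1.0):
    """Sketch of a block-style local particle filter (illustrative only).

    Assumed toy model: X_t = a * X_{t-1} + noise, Y_t = X_t + noise,
    with d spatial coordinates observed independently at each time.
    """
    # Partition the d coordinates into contiguous blocks.
    blocks = [np.arange(i, min(i + block_size, d)) for i in range(0, d, block_size)]
    particles = rng.normal(size=(n_particles, d))  # initial ensemble
    means = []
    for y in ys:  # y has shape (d,)
        # Propagate every particle through the assumed dynamics.
        particles = a * particles + sigma_x * rng.normal(size=particles.shape)
        new_particles = np.empty_like(particles)
        for idx in blocks:
            # Local weights: use only the observations inside this block.
            log_w = -0.5 * ((y[idx] - particles[:, idx]) ** 2).sum(axis=1) / sigma_y**2
            w = np.exp(log_w - log_w.max())
            w /= w.sum()
            # Resample this block's coordinates independently of other blocks.
            choice = rng.choice(n_particles, size=n_particles, p=w)
            new_particles[:, idx] = particles[choice][:, idx]
        particles = new_particles
        means.append(particles.mean(axis=0))
    return np.array(means)

# Tiny usage example on synthetic data from the same toy model.
d, T = 20, 50
x = np.zeros(d)
ys = []
for _ in range(T):
    x = 0.9 * x + rng.normal(size=d)
    ys.append(x + rng.normal(size=d))
est = block_particle_filter(ys, d=d, n_particles=200, block_size=5)
print(est.shape)  # (50, 20): filtered mean per time step and coordinate
```

The point of the localization is that each resampling step only sees the observations of one block, so weight degeneracy is governed by the block size rather than the full dimension d; the price is a bias at block boundaries, which the abstract's decay-of-correlations property is meant to keep controlled.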

