
Large Deviation Asymptotics and Bayesian Posterior Consistency on Stochastic Processes and Dynamical Systems (2106.06894v2)

Published 13 Jun 2021 in math.ST, math.DS, math.PR, and stat.TH

Abstract: We consider generalized Bayesian inference on stochastic processes and dynamical systems with potentially long-range dependency. Given a sequence of observations, a class of parametrized model processes with a prior distribution, and a loss function, we specify the generalized posterior distribution. The problem of frequentist posterior consistency is concerned with whether, as more and more samples are observed, the posterior distribution on parameters will asymptotically concentrate on the "right" parameters. We show that posterior consistency can be derived using a combination of classical large deviation techniques, such as Varadhan's lemma, conditional/quenched large deviations, annealed large deviations, and exponential approximations. We show that the posterior distribution will asymptotically concentrate on parameters that minimize the expected loss and a divergence term, and we identify the divergence term as the Donsker-Varadhan relative entropy rate from process-level large deviations. As an application, we prove new quenched and annealed large deviation asymptotics and new Bayesian posterior consistency results for a class of mixing stochastic processes. In the case of Markov processes, one can obtain explicit conditions for posterior consistency whenever estimates for log-Sobolev constants are available, which makes our framework essentially a black box. We also recover state-of-the-art posterior consistency on classical dynamical systems with a simple proof. Our approach has the potential to prove posterior consistency for a wide range of Bayesian procedures in a unified way.
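The "generalized posterior distribution" referenced in the abstract is usually written as a Gibbs posterior. As a hedged sketch in standard notation (the symbols $\ell$, $\lambda$, and $\pi$ are conventional choices here, not taken from the paper itself):

```latex
% Generalized (Gibbs) posterior after observing x_1, ..., x_n,
% given a prior pi, a loss ell, and a temperature lambda > 0:
\pi_n(\mathrm{d}\theta \mid x_{1:n})
  \;\propto\;
  \exp\!\Bigl(-\lambda \sum_{i=1}^{n} \ell(\theta, x_i)\Bigr)\,\pi(\mathrm{d}\theta).

% The consistency result described in the abstract says that, asymptotically,
% the posterior concentrates on minimizers of the expected loss plus a
% divergence term (identified as the Donsker--Varadhan relative entropy rate h):
\theta^\star \;\in\; \operatorname*{arg\,min}_{\theta}
  \Bigl\{ \mathbb{E}\bigl[\ell(\theta, X)\bigr] + h(\theta) \Bigr\}.
```

Taking $\ell$ to be a negative log-likelihood and $\lambda = 1$ recovers the ordinary Bayesian posterior as a special case.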

