Robust Comparative Statics with Misspecified Bayesian Learning (2407.17037v2)
Abstract: We present novel monotone comparative statics results for steady-state behavior in a dynamic optimization environment with misspecified Bayesian learning. Building on \cite{ep21a}, we analyze a Bayesian learner whose prior is supported on a parameterized set of transition models but is misspecified in the sense that the true process does not belong to this set. We characterize conditions that ensure monotonicity of the steady-state distribution over states, actions, and inferred models. Additionally, we provide a new monotonicity-based proof of steady-state existence, derive an upper bound on the cost of misspecification, and illustrate the applicability of our results in several environments of general interest.
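To make the core ingredient concrete, below is a minimal sketch of misspecified Bayesian learning in Python. It assumes a much simpler setting than the paper's dynamic environment: i.i.d. Bernoulli data rather than controlled transitions, a two-element model set, and illustrative parameter values, none of which come from the paper. It shows the standard Berk-style phenomenon the paper's framework builds on: when the true process lies outside the learner's model set, the posterior concentrates on the model minimizing KL divergence from the truth.

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's environment):
# true data are i.i.d. Bernoulli(0.7), but the learner's parameterized
# model set {0.2, 0.5} excludes 0.7, so the prior is misspecified.
rng = np.random.default_rng(0)
p_true = 0.7
models = np.array([0.2, 0.5])  # hypothetical parameterized model set
log_posterior = np.log(np.full(len(models), 1.0 / len(models)))  # uniform prior

for _ in range(5000):
    x = rng.random() < p_true  # draw an observation from the true process
    # Bayesian update in log space for numerical stability
    log_lik = np.where(x, np.log(models), np.log(1.0 - models))
    log_posterior += log_lik
    log_posterior -= log_posterior.max()  # rescale to avoid underflow

posterior = np.exp(log_posterior)
posterior /= posterior.sum()
print(dict(zip(models, posterior.round(4))))
# The posterior concentrates on 0.5, the KL-divergence minimizer
# within the misspecified set, as in Berk-style convergence results.
```

In the paper's richer setting, the analogous steady-state objects are distributions over states, actions, and inferred models; the comparative statics results concern how these move monotonically with the environment's primitives.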