Entropy Production and Information Flow for Markov Diffusions with Filtering
Abstract: Filtering theory gives explicit models for the flow of information and thereby quantifies the rates at which information is supplied to and dissipated from the filter's memory. Here we extend the analysis of Mitter and Newton from linear Gaussian models to general nonlinear filters involving Markov diffusions. The rates of entropy production are now generally the average squared-field (co-metric) of various logarithmic probability densities, which may be interpreted as Fisher information associated with Gaussian perturbations (via de Bruijn's identity). We show that the central connection is made through the theorem of Mayer-Wolf and Zakai for the rate of change of the mutual information between the filtered state and the observation history. In particular, we extend this theorem to cover a Markov diffusion controlled by the observation process, which may be interpreted as the filter acting as a Maxwell's daemon applying feedback to the system.
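For reference, the de Bruijn identity mentioned above is the standard relation between differential entropy under Gaussian smoothing and Fisher information: if $Z$ is a standard Gaussian independent of $X$, then

```latex
% de Bruijn's identity: the entropy of a Gaussian-perturbed variable
% grows at a rate set by its Fisher information.
% Here h denotes differential entropy and J the Fisher information
% of the density of X + sqrt(t) Z with respect to location shifts.
\frac{\mathrm{d}}{\mathrm{d}t}\, h\!\left(X + \sqrt{t}\,Z\right)
  \;=\; \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right),
\qquad Z \sim \mathcal{N}(0,1),\ Z \perp X .
```

This is the sense in which the entropy-production rates in the abstract, expressed as averaged squared-fields of logarithmic densities, can be read as Fisher-information terms; the specific multivariate form used in the paper may differ in normalization.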