
Entropy Production and Information Flow for Markov Diffusions with Filtering

Published 16 Oct 2017 in math-ph, math.MP, and math.PR | arXiv:1710.05553v1

Abstract: Filtering theory gives explicit models for the flow of information and thereby quantifies the rates at which information is supplied to and dissipated from the filter's memory. Here we extend the analysis of Mitter and Newton from linear Gaussian models to general nonlinear filters involving Markov diffusions. The rates of entropy production are now generally the average squared-field (co-metric) of various logarithmic probability densities, which may be interpreted as the Fisher information associated with Gaussian perturbations (via de Bruijn's identity). We show that the central connection is made through the Mayer-Wolf and Zakai theorem for the rate of change of the mutual information between the filtered state and the observation history. In particular, we extend this theorem to cover a Markov diffusion controlled by the observation process, which may be interpreted as the filter acting as a Maxwell's Daemon applying feedback to the system.
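The abstract invokes de Bruijn's identity, which states that for Y_t = X + sqrt(t) Z with Z a standard Gaussian independent of X, the differential entropy satisfies dH(Y_t)/dt = J(Y_t)/2, where J is the Fisher information with respect to location. A minimal numerical sketch (not from the paper; it uses the closed forms for a Gaussian X, where both sides are available explicitly) checks the identity by finite differences:

```python
import math

# de Bruijn's identity: for Y_t = X + sqrt(t) * Z, Z ~ N(0, 1) independent of X,
#   d/dt H(Y_t) = (1/2) * J(Y_t),
# where H is differential entropy and J is Fisher information (location parameter).
# For X ~ N(0, sigma^2) we have Y_t ~ N(0, sigma^2 + t), so both sides are explicit.

def entropy(var):
    # differential entropy of N(0, var): (1/2) ln(2 * pi * e * var)
    return 0.5 * math.log(2 * math.pi * math.e * var)

def fisher_info(var):
    # Fisher information of N(mu, var) with respect to mu: 1 / var
    return 1.0 / var

sigma2, t, h = 1.0, 0.5, 1e-6
# left side: central finite difference of H(Y_t) in t
lhs = (entropy(sigma2 + t + h) - entropy(sigma2 + t - h)) / (2 * h)
# right side: half the Fisher information of Y_t
rhs = 0.5 * fisher_info(sigma2 + t)
print(lhs, rhs)  # both sides equal 1/3 up to discretisation error
```

In the Gaussian case the agreement is exact up to the finite-difference error; the paper's point is that analogous squared-field (co-metric) quantities play this Fisher-information role for general nonlinear filters.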

Authors (2)

Collections

Sign up for free to add this paper to one or more collections.