Information flow and causality as rigorous notions ab initio (1503.08389v1)
Abstract: Information flow (or information transfer, as it is also called), a widely applicable notion in general physics, can be rigorously derived from first principles rather than axiomatically proposed as an ansatz. Its logical association with causality, and in particular with the most stringent one-way causality where it exists, is firmly substantiated and stated as a fact in proved theorems. Established in this study are the information flows among the components of time-discrete mappings and of time-continuous dynamical systems of arbitrary dimensionality, both deterministic and stochastic. They are obtained explicitly in closed form, and all possess the property of causality, which reads: if a component, say $x_i$, has an evolutionary law independent of $x_j$, then the information flow from $x_j$ to $x_i$ vanishes. These results are applied to benchmark systems such as the Kaplan-Yorke map, the R\"ossler system, the baker transformation, the H\'enon map, and a stochastic potential flow. Besides recovering the expected properties of the respective systems, some of the applications show that the information flow structure underlying a complex trajectory pattern can be made tractable. For linear systems, the resulting remarkably concise formula asserts analytically that causation implies correlation, while correlation does not imply causation, resolving unambiguously the long-standing debate over causation versus correlation.
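The causality property and the linear-system claim above can be illustrated with a small numerical sketch (not the paper's derivation). The example below simulates a hypothetical one-way coupled linear stochastic system in which $x_1$ evolves independently of $x_2$: by the causality property, the information flow from $x_2$ to $x_1$ should vanish even though the two components are correlated. The system, its parameters, and the bivariate estimator `liang_flow` are assumptions introduced here for illustration; the estimator follows Liang's bivariate linear maximum-likelihood formula from related work and is not reproduced from this abstract.

```python
# Minimal sketch: one-way coupled linear stochastic system (hypothetical example)
#   dx1 = -a*x1 dt + s1 dW1              (x1 ignores x2)
#   dx2 = (b*x1 - c*x2) dt + s2 dW2      (x2 driven by x1)
# so T_{2->1} should vanish while T_{1->2} need not, yet corr(x1, x2) != 0.
# Assumption: liang_flow() uses the bivariate linear estimator from Liang's
# related work, not a formula taken from this abstract.

import numpy as np

rng = np.random.default_rng(0)

# --- simulate the system with Euler-Maruyama ---
dt, n = 0.01, 50_000
a, b, c, s1, s2 = 1.0, 1.0, 1.0, 0.5, 0.5
x = np.zeros((n, 2))
for k in range(n - 1):
    dW = rng.normal(0.0, np.sqrt(dt), size=2)
    x[k + 1, 0] = x[k, 0] + (-a * x[k, 0]) * dt + s1 * dW[0]
    x[k + 1, 1] = x[k, 1] + (b * x[k, 0] - c * x[k, 1]) * dt + s2 * dW[1]


def liang_flow(xi, xj, dt):
    """Estimate the information flow rate T_{j->i} from series xj to series xi.

    Assumption: Liang's bivariate linear estimator,
        T_{j->i} = (C_ii*C_ij*C_{j,di} - C_ij^2*C_{i,di})
                   / (C_ii^2*C_jj - C_ii*C_ij^2),
    where C_{.,di} is the covariance with the forward-difference derivative of xi.
    """
    dxi = (xi[1:] - xi[:-1]) / dt          # finite-difference derivative of xi
    xi, xj = xi[:-1], xj[:-1]
    C = np.cov(np.vstack([xi, xj]))
    Cii, Cjj, Cij = C[0, 0], C[1, 1], C[0, 1]
    C_i_di = np.cov(xi, dxi)[0, 1]
    C_j_di = np.cov(xj, dxi)[0, 1]
    return (Cii * Cij * C_j_di - Cij**2 * C_i_di) / (Cii**2 * Cjj - Cii * Cij**2)


x1, x2 = x[:, 0], x[:, 1]
print("corr(x1, x2)      =", np.corrcoef(x1, x2)[0, 1])  # clearly nonzero
print("T_{2->1} estimate =", liang_flow(x1, x2, dt))      # ~ 0: x1 independent of x2
print("T_{1->2} estimate =", liang_flow(x2, x1, dt))      # nonzero: x1 drives x2
```

The point of the sketch is the asymmetry: the correlation between the two series is symmetric and nonzero, while the estimated flow is appreciable only in the direction of the actual coupling, consistent with the abstract's statement that causation implies correlation but correlation does not imply causation.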