
Information erasure lurking behind measures of complexity (0905.2918v2)

Published 18 May 2009 in physics.data-an, cond-mat.stat-mech, and cs.CC

Abstract: Complex systems are found in most branches of science. It is still debated how best to quantify their complexity and to what end. One prominent measure of complexity (the statistical complexity) has an operational meaning in terms of the amount of resources needed to forecast a system's behaviour. Another (the effective measure complexity, also known as excess entropy) measures the mutual information stored in the system proper. We show that for any given system the two measures differ by the amount of information erased during forecasting. We interpret this difference as the inefficiency of a given model. We find a bound on the ratio of the two measures, defined as the information-processing efficiency, in analogy to the second law of thermodynamics. This new link between two prominent measures of complexity provides a quantitative criterion for good models of complex systems, namely those with little information erasure.
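The abstract's relation (efficiency E / C_mu <= 1, with C_mu - E the information erased during forecasting) can be checked on a small worked example. The sketch below is illustrative and not taken from the paper: it assumes the Golden Mean Process (a standard two-state epsilon-machine in which the word "11" is forbidden), takes the statistical complexity C_mu to be the Shannon entropy of the stationary causal-state distribution, and uses the standard shortcut that for a first-order Markov process the excess entropy reduces to E = H[X_1] - h_mu, where h_mu is the entropy rate.

```python
from math import log2

def H(dist):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * log2(p) for p in dist if p > 0)

# Golden Mean Process epsilon-machine: from state A, emit 0 or 1 with
# probability 1/2 (emitting 1 moves to state B); from state B, emit 0
# with probability 1 and return to A, so "11" never occurs.
pi = {"A": 2/3, "B": 1/3}                 # stationary causal-state distribution

C_mu = H(pi.values())                     # statistical complexity, ~0.918 bits

# Entropy rate: state-weighted entropy of the per-state emission distributions.
h_mu = pi["A"] * H([0.5, 0.5]) + pi["B"] * H([1.0])   # = 2/3 bit/symbol

p_symbol = [2/3, 1/3]                     # stationary P(X_1 = 0), P(X_1 = 1)
E = H(p_symbol) - h_mu                    # excess entropy, ~0.252 bits

erased = C_mu - E                         # information erased during forecasting
eta = E / C_mu                            # information-processing efficiency <= 1

print(f"C_mu = {C_mu:.3f} bits, E = {E:.3f} bits")
print(f"erased = {erased:.3f} bits, efficiency = {eta:.3f}")
```

For this process the sketch gives C_mu ≈ 0.918 bits and E ≈ 0.252 bits, so roughly 0.667 bits of state information are erased per prediction and the efficiency is about 0.27, consistent with the bound the abstract describes.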

Authors (4)
  1. Karoline Wiesner (18 papers)
  2. Mile Gu (105 papers)
  3. Elisabeth Rieper (4 papers)
  4. Vlatko Vedral (193 papers)
Citations (6)
