Directed Redundancy in Time Series (2405.00368v1)

Published 1 May 2024 in cs.IT and math.IT

Abstract: We quantify the average amount of redundant information that is transferred from a subset of relevant random source processes to a target process. To identify the relevant source processes, we consider those that are connected to the target process and that, in addition, share a certain proportion of the total information causally provided to the target. Even if the relevant processes have no directed information exchange between them, they can still causally provide redundant information to the target, which makes it difficult to identify them. To resolve this, we propose the existence of a hidden redundancy process that governs the information shared among the relevant processes. We bound the redundancy by the minimum of the average directed redundancy from the relevant processes to the target, from the hidden redundancy process to the target, and from the hidden redundancy process to the relevant processes.
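
Since the abstract builds on Massey's notion of directed information, a brief sketch may help fix ideas. The first line is the standard definition; the second is only a hedged reconstruction of the kind of bound the abstract describes, not the paper's exact result. The symbols Y (target), X_1, ..., X_K (relevant sources), and W (hidden redundancy process) are assumed labels, and directed information is used here merely as a stand-in for the paper's directed-redundancy terms.

% Massey's directed information from X^n to Y^n (standard definition)
I(X^n \to Y^n) \;=\; \sum_{t=1}^{n} I\big(X^t;\, Y_t \,\big|\, Y^{t-1}\big)

% Hedged sketch of the bound suggested by the abstract: with W a hypothetical
% hidden redundancy process and R the average directed redundancy delivered to Y,
R \;\le\; \min\Big\{ \min_{k} \tfrac{1}{n} I(X_k^n \to Y^n),\;\; \tfrac{1}{n} I(W^n \to Y^n),\;\; \min_{k} \tfrac{1}{n} I(W^n \to X_k^n) \Big\}

In words: the redundancy reaching the target cannot exceed the smallest of the causal flows from any relevant source to the target, from the hidden process to the target, or from the hidden process to any relevant source.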
