
Exact computation of Transfer Entropy with Path Weight Sampling (2409.01650v3)

Published 3 Sep 2024 in q-bio.MN, cond-mat.soft, cond-mat.stat-mech, cs.IT, math.IT, and physics.bio-ph

Abstract: The ability to quantify the directional flow of information is vital to understanding natural systems and designing engineered information-processing systems. A widely used measure to quantify this information flow is the transfer entropy. However, until now, this quantity could only be obtained in dynamical models using approximations that are typically uncontrolled. Here we introduce a computational algorithm called Transfer Entropy-Path Weight Sampling (TE-PWS), which makes it possible, for the first time, to quantify the transfer entropy and its variants exactly for any stochastic model, including those with multiple hidden variables, nonlinearity, transient conditions, and feedback. By leveraging techniques from polymer and path sampling, TE-PWS efficiently computes the transfer entropy as a Monte-Carlo average over signal trajectory space. We apply TE-PWS to linear and nonlinear systems to reveal how transfer entropy can overcome naive applications of the data processing inequality in the presence of feedback.
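To make the quantity targeted by TE-PWS concrete, the following is a minimal sketch, not the TE-PWS algorithm from the paper: it estimates Schreiber's transfer entropy (history length 1) from X to Y for a toy coupled binary Markov chain using a simple plug-in (histogram) estimator. The model, the parameters p_x_flip and p_y_copy, and the estimator choice are illustrative assumptions, not taken from the paper.

```python
"""Illustrative sketch only: plug-in transfer-entropy estimate for a toy
coupled binary Markov chain. This is NOT the TE-PWS algorithm."""
import numpy as np

rng = np.random.default_rng(0)

# Toy coupled chain (illustrative parameters, not from the paper):
# X flips on its own; Y tends to copy the current X, so information flows X -> Y.
p_x_flip = 0.1   # P(x_{t+1} != x_t)
p_y_copy = 0.8   # P(y_{t+1} == x_t)

T = 500_000
x = np.zeros(T, dtype=np.int64)
y = np.zeros(T, dtype=np.int64)
for t in range(T - 1):
    x[t + 1] = x[t] ^ int(rng.random() < p_x_flip)
    y[t + 1] = x[t] if rng.random() < p_y_copy else 1 - x[t]

# Plug-in estimate of TE_{X->Y} = sum_{x,y,y'} p(x, y, y') log[ p(y'|y,x) / p(y'|y) ]
counts = np.zeros((2, 2, 2))                      # indexed by (x_t, y_t, y_{t+1})
np.add.at(counts, (x[:-1], y[:-1], y[1:]), 1.0)
p_xyy = counts / counts.sum()                     # joint p(x_t, y_t, y_{t+1})
p_xy = p_xyy.sum(axis=2, keepdims=True)           # p(x_t, y_t)
p_yy = p_xyy.sum(axis=0, keepdims=True)           # p(y_t, y_{t+1})
p_y = p_yy.sum(axis=2, keepdims=True)             # p(y_t)

cond_full = p_xyy / p_xy                              # p(y_{t+1} | y_t, x_t)
cond_red = np.broadcast_to(p_yy / p_y, p_xyy.shape)   # p(y_{t+1} | y_t)

mask = p_xyy > 0
te_xy = np.sum(p_xyy[mask] * np.log(cond_full[mask] / cond_red[mask]))
print(f"plug-in estimate of TE_(X->Y): {te_xy:.4f} nats per step")
```

A plug-in estimator like this relies on binning and a finite history length; the abstract's point is that TE-PWS instead evaluates the trajectory-level transfer entropy exactly, as a Monte-Carlo average over signal trajectory space, which sidesteps such uncontrolled approximations.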
