Sparse dynamic network reconstruction through L1-regularization of a Lyapunov equation (2403.05457v2)

Published 8 Mar 2024 in eess.SY and cs.SY

Abstract: An important problem in many areas of science is that of recovering interaction networks from simultaneous time series of many interacting dynamical processes. A common approach is to use the elements of the correlation matrix or its inverse as proxies for the interaction strengths, but the reconstructed networks are necessarily undirected. Transfer-entropy methods have been proposed to reconstruct directed networks, but the resulting networks carry no information about interaction strengths. We propose a network reconstruction method that inherits the best of both approaches by reconstructing a directed weighted network from noisy data, under the assumption that the network is sparse and the dynamics are governed by a linear (or weakly nonlinear) stochastic dynamical system. The two steps of our method are (i) constructing an (infinite) family of candidate networks by solving the covariance matrix Lyapunov equation for the state matrix, and (ii) using L1-regularization to select a sparse solution. We further show how prior information on the (non)existence of a few directed edges can drastically improve the quality of the reconstruction.
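The following is a minimal sketch of the two-step procedure in Python, using the CVXPY package cited in [18]. It assumes the noise covariance Q is known (the identity is a common default) and relies on the classical parameterization of all Lyapunov-equation solutions by a skew-symmetric matrix, as in [12]; the function name and the `prior_zero_mask` argument (a hypothetical way to encode edges known to be absent, mirroring the paper's prior-information extension) are illustrative, not the authors' implementation.

```python
import numpy as np
import cvxpy as cp

def reconstruct_sparse_state_matrix(C, Q, prior_zero_mask=None):
    """Two-step sketch: (i) parameterize every solution A of the Lyapunov
    equation A C + C A^T + Q = 0 as A = (S - Q/2) C^{-1} with S
    skew-symmetric (valid for C positive definite, Q symmetric), then
    (ii) select the entrywise-sparsest member of that family with an
    L1 objective."""
    n = C.shape[0]
    C_inv = np.linalg.inv(C)
    S = cp.Variable((n, n))
    A = (S - Q / 2) @ C_inv              # candidate family, indexed by S
    constraints = [S + S.T == 0]         # enforce skew-symmetry of S
    if prior_zero_mask is not None:
        # Optional prior: entries flagged 1 in the mask are forced to zero.
        constraints.append(cp.multiply(prior_zero_mask, A) == 0)
    cp.Problem(cp.Minimize(cp.sum(cp.abs(A))), constraints).solve()
    return A.value

# Toy check: build the exact stationary covariance of a sparse stable
# A_true and see whether the L1 selection recovers it from C alone.
from scipy.linalg import solve_continuous_lyapunov

A_true = np.array([[-1.0, 0.5, 0.0],
                   [0.0, -1.0, 0.3],
                   [0.2, 0.0, -1.0]])
Q = np.eye(3)
C = solve_continuous_lyapunov(A_true, -Q)  # solves A C + C A^T = -Q
A_hat = reconstruct_sparse_state_matrix(C, Q)
```

In practice C would be estimated from data, e.g. `C = np.cov(X)` for an n-by-T array of time series X, and recovery of the true sparsity pattern is only expected when the network is sparse enough relative to the n(n-1)/2 degrees of freedom in the skew-symmetric parameter.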

References (19)
  1. R. Liégeois, A. Santos, V. Matta, D. Van De Ville, and A. H. Sayed, “Revisiting correlation-based functional connectivity and its relationship with structural connectivity,” Network Neuroscience, vol. 4, no. 4, pp. 1235–1251, 2020.
  2. T. W. Lin, A. Das, G. P. Krishnan, M. Bazhenov, and T. J. Sejnowski, “Differential covariance: A new class of methods to estimate sparse connectivity from neural recordings,” Neural Computation, vol. 29, no. 10, pp. 2581–2632, 2017.
  3. R. Mohanty, W. A. Sethares, V. A. Nair, and V. Prabhakaran, “Rethinking measures of functional connectivity via feature extraction,” Scientific Reports, vol. 10, no. 1, p. 1298, 2020.
  4. T. Schreiber, “Measuring information transfer,” Physical Review Letters, vol. 85, no. 2, pp. 461–464, 2000.
  5. M. Lobier, F. Siebenhühner, S. Palva, and J. M. Palva, “Phase transfer entropy: A novel phase-based measure for directed connectivity in networks coupled by oscillatory interactions,” NeuroImage, vol. 85, pp. 853–872, 2014.
  6. M. Ursino, G. Ricci, and E. Magosso, “Transfer entropy as a measure of brain connectivity: A critical analysis with the help of neural mass models,” Frontiers in Computational Neuroscience, vol. 14, 2020.
  7. L. Novelli, P. Wollstadt, P. Mediano, M. Wibral, and J. T. Lizier, “Large-scale directed network inference with multivariate transfer entropy and hierarchical statistical testing,” Network Neuroscience, vol. 3, no. 3, pp. 827–847, 2019.
  8. N. M. Timme and C. Lapish, “A tutorial for information theory in neuroscience,” eNeuro, vol. 5, no. 3, pp. ENEURO.0052–18.2018, 2018.
  9. P. Sharma, D. J. Bucci, S. K. Brahma, and P. K. Varshney, “Communication network topology inference via transfer entropy,” IEEE Transactions on Network Science and Engineering, vol. 7, no. 1, pp. 562–575, 2017.
  10. K. J. Friston, L. Harrison, and W. Penny, “Dynamic causal modelling,” NeuroImage, vol. 19, no. 4, pp. 1273–1302, 2003.
  11. U. Casti, G. Baggio, D. Benozzo, S. Zampieri, A. Bertoldo, and A. Chiuso, “Dynamic brain networks with prescribed functional connectivity,” in 2023 62nd IEEE Conference on Decision and Control (CDC). IEEE, 2023, pp. 709–714.
  12. K. Fernando and H. Nicholson, “Solution of Lyapunov equation for the state matrix,” Electronics Letters, vol. 17, no. 5, pp. 204–205, 1981.
  13. E. Candes and J. Romberg, “l1-magic: Recovery of sparse signals via convex programming,” URL: www.acm.caltech.edu/l1magic/downloads/l1magic.pdf, vol. 4, no. 14, p. 16, 2005.
  14. A. Rahimzamani and S. Kannan, “Network inference using directed information: The deterministic limit,” in 2016 54th Annual Allerton Conference on Communication, Control, and Computing (Allerton). Monticello, IL, USA: IEEE, 2016, pp. 156–163.
  15. L. Barnett, A. B. Barrett, and A. K. Seth, “Granger causality and transfer entropy are equivalent for Gaussian variables,” Physical Review Letters, vol. 103, no. 23, p. 238701, 2009.
  16. P. L. Williams and R. D. Beer, “Generalized measures of information transfer,” arXiv:1102.1507 [physics], 2011.
  17. P. Wollstadt, J. T. Lizier, R. Vicente, C. Finn, M. Martínez-Zarzuela, P. Mediano, L. Novelli, and M. Wibral, “IDTxl: The Information Dynamics Toolkit xl: a Python package for the efficient analysis of multivariate information dynamics in networks,” Journal of Open Source Software, vol. 4, no. 34, p. 1081, 2019.
  18. S. Diamond and S. Boyd, “CVXPY: A Python-embedded modeling language for convex optimization,” Journal of Machine Learning Research, vol. 17, no. 83, pp. 1–5, 2016.
  19. R. Rossi-Pool, A. Zainos, M. Alvarez, S. Parra, J. Zizumbo, and R. Romo, “Invariant timescale hierarchy across the cortical somatosensory network,” Proceedings of the National Academy of Sciences, vol. 118, no. 3, p. e2021843118, 2021.
