
Learning Coarse-Grained Dynamics on Graph

Published 15 May 2024 in math.NA, cond-mat.dis-nn, cs.LG, and cs.NA (arXiv:2405.09324v2)

Abstract: We consider a Graph Neural Network (GNN) non-Markovian modeling framework to identify coarse-grained dynamical systems on graphs. Our main idea is to systematically determine the GNN architecture by inspecting how the leading term of the Mori-Zwanzig memory term depends on the coarse-grained interaction coefficients that encode the graph topology. Based on this analysis, we find that a GNN architecture accounting for $K$-hop dynamical interactions must employ a Message Passing (MP) mechanism with at least $2K$ steps. We also deduce that the memory length required for an accurate closure model decreases with the interaction strength, under the assumption that the interaction strength follows a power law decaying with hop distance. Supporting numerical demonstrations on two examples, a heterogeneous Kuramoto oscillator model and a power system, suggest that the proposed GNN architecture can predict the coarse-grained dynamics under fixed and time-varying graph topologies.
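A toy illustration of the receptive-field argument behind the $2K$-step requirement (this sketch is not from the paper; the graph and helper names are illustrative): stacking $m$ message-passing steps lets each node aggregate information from at most its $m$-hop neighborhood, so a memory term whose leading contribution couples nodes up to $2K$ hops apart cannot be represented with fewer than $2K$ MP steps.

```python
def mp_receptive_field(neighbors, num_steps):
    """After `num_steps` rounds of message passing (with self-loops),
    return, for each node, the set of nodes whose state can influence it."""
    reach = {v: {v} for v in neighbors}  # step 0: each node sees only itself
    for _ in range(num_steps):
        # One MP step: a node aggregates its own view and its neighbors' views.
        reach = {v: set.union(reach[v], *(reach[u] for u in neighbors[v]))
                 for v in neighbors}
    return reach

# Path graph 0-1-2-3-4: nodes 0 and 4 are 4 hops apart.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}

K = 2  # target: model K-hop dynamical interactions
print(4 in mp_receptive_field(path, 2 * K)[0])  # True:  2K = 4 steps span 4 hops
print(4 in mp_receptive_field(path, K)[0])      # False: K steps span only K hops
```

The count of MP steps, not the model's parameter count, is what bounds how far information can travel per forward pass; this is why the paper's analysis prescribes the architecture depth rather than the layer width.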

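For readers unfamiliar with the first test problem: the classical Kuramoto model evolves oscillator phases as $\dot{\theta}_i = \omega_i + (\kappa/N)\sum_j \sin(\theta_j - \theta_i)$, with synchrony measured by the order parameter $r = |\frac{1}{N}\sum_j e^{i\theta_j}|$. Below is a minimal explicit-Euler simulation of the all-to-all model; the coupling strength, frequency distribution, and step size are illustrative choices, not the paper's heterogeneous graph-coupled setup.

```python
import math
import random

def kuramoto_step(theta, omega, coupling, dt):
    """One explicit-Euler step of the all-to-all Kuramoto model:
    dtheta_i/dt = omega_i + (coupling / N) * sum_j sin(theta_j - theta_i)."""
    n = len(theta)
    return [t + dt * (w + coupling / n * sum(math.sin(s - t) for s in theta))
            for t, w in zip(theta, omega)]

def order_parameter(theta):
    """r = |(1/N) sum_j exp(i * theta_j)|; r = 1 means full phase synchrony."""
    n = len(theta)
    re = sum(math.cos(t) for t in theta) / n
    im = sum(math.sin(t) for t in theta) / n
    return math.hypot(re, im)

random.seed(0)
N = 50
theta = [random.uniform(-math.pi, math.pi) for _ in range(N)]
omega = [random.gauss(0.0, 0.1) for _ in range(N)]  # heterogeneous frequencies

r_initial = order_parameter(theta)
for _ in range(2000):  # integrate to t = 20 with dt = 0.01
    theta = kuramoto_step(theta, omega, coupling=2.0, dt=0.01)

# With coupling well above the critical value, r grows from the incoherent
# O(1/sqrt(N)) level toward 1.
print(r_initial, order_parameter(theta))
```

Coarse-graining such a system (e.g., tracking only cluster-averaged phases) discards the unresolved oscillators, and the Mori-Zwanzig formalism makes the resulting memory term explicit; this is the closure the paper's GNN is trained to learn.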
Authors (4)