Newton-Cotes Graph Neural Networks: On the Time Evolution of Dynamic Systems (2305.14642v3)

Published 24 May 2023 in cs.LG and cs.CE

Abstract: Reasoning about system dynamics is one of the most important analytical approaches in many scientific studies. Given the initial state of a system as input, recent graph neural network (GNN)-based methods can predict a future state that is distant in time with high accuracy. Although these methods differ in how they model the coordinates and interacting forces of the system, we show that they in fact share a common paradigm: learning the integral of the velocity over the interval between the initial and terminal coordinates. Their integrand, however, is constant with respect to time. Inspired by this observation, we propose a new approach that predicts the integral from several velocity estimates combined via Newton-Cotes formulas, and we prove its effectiveness theoretically. Extensive experiments on several benchmarks demonstrate consistent and significant improvements over state-of-the-art methods.
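
To make the quadrature view in the abstract concrete, here is a minimal sketch, not the authors' implementation. Prior methods effectively predict x(T) = x(0) + T · v̂ from a single time-independent velocity estimate, a one-point rule for the integral x(T) = x(0) + ∫₀ᵀ v(t) dt; Newton-Cotes formulas instead combine velocity estimates at several equally spaced times with fixed quadrature weights. In the sketch below, `newton_cotes_step` and `NEWTON_COTES_WEIGHTS` are hypothetical names, NumPy is assumed, and the velocity estimates are passed in directly in place of the learned GNN predictions the paper uses.

```python
import numpy as np

# Normalized closed Newton-Cotes weights for n + 1 equally spaced nodes.
# n = 1: trapezoidal rule, n = 2: Simpson's rule, n = 4: Boole's rule.
NEWTON_COTES_WEIGHTS = {
    1: np.array([1.0, 1.0]) / 2.0,
    2: np.array([1.0, 4.0, 1.0]) / 6.0,
    3: np.array([1.0, 3.0, 3.0, 1.0]) / 8.0,
    4: np.array([7.0, 32.0, 12.0, 32.0, 7.0]) / 90.0,
}

def newton_cotes_step(x0, velocity_estimates, T):
    """Advance coordinates x0 over horizon T with Newton-Cotes quadrature.

    velocity_estimates holds n + 1 velocity predictions at the equally
    spaced times 0, T/n, ..., T. In the paper these would come from a
    learned GNN; here they are plain inputs to keep the sketch small.
    """
    v = np.asarray(velocity_estimates, dtype=float)
    w = NEWTON_COTES_WEIGHTS[len(v) - 1]
    # x(T) = x(0) + T * sum_i w_i * v_hat(t_i)
    return x0 + T * np.tensordot(w, v, axes=1)

# Sanity check with v(t) = 2t, whose true displacement over [0, 1] is 1.
T = 1.0
times = np.linspace(0.0, T, 3)            # 3 nodes -> Simpson's rule (n = 2)
v_hats = 2.0 * times                      # exact velocities stand in for GNN output
print(newton_cotes_step(0.0, v_hats, T))  # 1.0: Simpson is exact for quadratics
```

A one-point rule, like the constant integrand of prior methods, would use only the initial velocity here and miss the change over the interval; the higher-order rule recovers the displacement exactly, which illustrates the integration-error gap the paper analyzes.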
