A quantum-inspired neural network for geometric modeling (2401.01801v2)

Published 3 Jan 2024 in cs.LG, cs.AI, and physics.comp-ph

Abstract: By conceiving physical systems as 3D many-body point clouds, geometric graph neural networks (GNNs), such as SE(3)/E(3)-equivariant GNNs, have showcased promising performance. In particular, their effective message-passing mechanisms make them adept at modeling molecules and crystalline materials. However, current geometric GNNs offer only a mean-field approximation of the many-body system, encapsulated within two-body message passing, and thus fall short of capturing intricate relationships within these geometric graphs. To address this limitation, tensor networks, widely employed in computational physics to handle many-body systems using high-order tensors, have been introduced. Nevertheless, integrating these tensorized networks into the message-passing framework of GNNs faces scalability and symmetry-conservation (e.g., permutation and rotation) challenges. In response, we introduce an equivariant Matrix Product State (MPS)-based message-passing strategy, built on an efficient implementation of the tensor contraction operation. Our method effectively models complex many-body relationships, going beyond the mean-field approximation, and captures symmetries within geometric graphs. Importantly, it seamlessly replaces the standard message-passing and layer-aggregation modules intrinsic to geometric GNNs. We empirically validate the superior accuracy of our approach on benchmark tasks, including predicting classical Newtonian systems and quantum tensor Hamiltonian matrices. To our knowledge, this is the first use of parameterized geometric tensor networks.
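
As an illustration of the tensor contraction at the heart of the MPS-based message passing described above, the sketch below contracts a small matrix product state (tensor train) against one feature vector per site using NumPy. This is a minimal, hypothetical sketch rather than the paper's implementation: the function name `mps_contract`, the shapes, and the random data are assumptions for the example, and the equivariance machinery the paper adds on top is omitted.

```python
# Minimal sketch (not the paper's implementation): contracting a matrix
# product state (tensor train) with one feature vector per site, the
# core operation behind MPS-based message aggregation. All names and
# dimensions here are illustrative assumptions.
import numpy as np

def mps_contract(cores, features):
    """Contract an MPS with one feature vector per site.

    cores    : list of arrays; cores[i] has shape
               (left_bond, phys_dim, right_bond), with boundary
               bond dimensions equal to 1.
    features : array of shape (n_sites, phys_dim), e.g. one message
               vector per neighbor in a message-passing layer.
    Returns the fully contracted network as a scalar.
    """
    # Trivial left boundary vector of shape (1,).
    left = np.ones(1)
    for core, x in zip(cores, features):
        # Absorb the physical index with the feature vector and push
        # the boundary through the bond index:
        # left[a] * core[a, p, b] * x[p] -> new_left[b]
        left = np.einsum('a,apb,p->b', left, core, x)
    # The right boundary bond dimension is 1, so `left` has one entry.
    return left[0]

# Usage example with random cores: 4 sites, physical dim 3, bond dim 2.
rng = np.random.default_rng(0)
bonds = [1, 2, 2, 2, 1]
cores = [rng.standard_normal((bonds[i], 3, bonds[i + 1])) for i in range(4)]
feats = rng.standard_normal((4, 3))
print(mps_contract(cores, feats))
```

Because the contraction sweeps left to right, its cost scales linearly in the number of sites and polynomially in the bond dimension, which is what makes an MPS parameterization tractable compared with storing the full high-order tensor.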
