Simplicial Message Passing for Chemical Property Prediction (2307.05392v1)

Published 9 Jul 2023 in cond-mat.mtrl-sci, cs.LG, and physics.chem-ph

Abstract: Message-passing neural networks (MPNNs) have recently emerged as a promising tool for handling molecular graphs and have achieved remarkable success in the discovery and design of materials with desired properties. However, classical MPNN methods are limited in their ability to capture the strong topological information hidden in molecular structures, for example when distinguishing nonisomorphic graphs. To address this problem, this work proposes a Simplicial Message Passing (SMP) framework that better captures the topological information of molecules and breaks through the limitations of the vanilla message-passing paradigm. In SMP, a generalized message-passing framework aggregates information from simplicial complexes of arbitrary order, and a hierarchical structure allows information exchange between simplices of different orders. We apply the SMP framework within deep learning architectures for quantum-chemical property prediction and achieve state-of-the-art results. The results show that, compared to traditional MPNNs, incorporating higher-order simplices better captures the complex structure of molecules and substantially enhances task performance. The SMP-based model provides a generalized framework for GNNs and can aid in the discovery and design of materials with tailored properties for various applications.
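
The abstract only sketches the SMP update rule. As a purely illustrative aid, below is a minimal PyTorch sketch of one cross-order message-passing step on a simplicial complex (atoms as 0-simplices, bonds as 1-simplices, triangles as 2-simplices). Everything in it, including the class name `SimplicialMPLayer`, the incidence-matrix formulation, and the linear-plus-ReLU update, is an assumption for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class SimplicialMPLayer(nn.Module):
    """One hypothetical round of message passing over a simplicial complex.

    Features live on simplices of each order k (0: atoms, 1: bonds,
    2: triangles, ...). boundaries[k] is an incidence matrix of shape
    (n_k, n_{k+1}) relating order-k simplices to the order-(k+1)
    simplices they bound; it routes messages between adjacent orders.
    """
    def __init__(self, dims):  # dims[k] = feature width of order-k simplices
        super().__init__()
        self.up = nn.ModuleList(
            nn.Linear(dims[k], dims[k + 1]) for k in range(len(dims) - 1))
        self.down = nn.ModuleList(
            nn.Linear(dims[k + 1], dims[k]) for k in range(len(dims) - 1))
        self.self_loop = nn.ModuleList(nn.Linear(d, d) for d in dims)

    def forward(self, feats, boundaries):
        # feats[k]: (n_k, dims[k]) features on order-k simplices
        # boundaries[k]: (n_k, n_{k+1}) incidence between orders k and k+1
        out = []
        for k, x in enumerate(feats):
            msg = self.self_loop[k](x)
            if k > 0:  # messages passed up from the (k-1)-faces
                msg = msg + boundaries[k - 1].t() @ self.up[k - 1](feats[k - 1])
            if k < len(feats) - 1:  # messages passed down from (k+1)-cofaces
                msg = msg + boundaries[k] @ self.down[k](feats[k + 1])
            out.append(torch.relu(msg))
        return out

# Toy usage: 5 atoms, 4 bonds, 1 triangle, 8-dim features at every order.
dims = [8, 8, 8]
layer = SimplicialMPLayer(dims)
feats = [torch.randn(5, 8), torch.randn(4, 8), torch.randn(1, 8)]
boundaries = [torch.randint(0, 2, (5, 4)).float(),
              torch.randint(0, 2, (4, 1)).float()]
out = layer(feats, boundaries)  # updated features for each simplex order
```

Stacking several such layers and pooling the order-0 (atom) features would give one plausible way to realize the hierarchical, arbitrary-order exchange the abstract describes.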

