
Neural Atoms: Propagating Long-range Interaction in Molecular Graphs through Efficient Communication Channel (2311.01276v3)

Published 2 Nov 2023 in cs.LG and q-bio.QM

Abstract: Graph Neural Networks (GNNs) have been widely adopted for drug discovery with molecular graphs. Nevertheless, current GNNs mainly excel at leveraging short-range interactions (SRI) but struggle to capture long-range interactions (LRI), both of which are crucial for determining molecular properties. To tackle this issue, we propose a method that abstracts the collective information of atomic groups into a few $\textit{Neural Atoms}$ by implicitly projecting the atoms of a molecule. Specifically, we explicitly exchange information among neural atoms and project them back to the atoms' representations as an enhancement. With this mechanism, neural atoms establish communication channels among distant nodes, effectively reducing the interaction scope of arbitrary node pairs to a single hop. To inspect our method from a physical perspective, we reveal its connection to the traditional LRI calculation method, Ewald Summation. Neural Atoms can enhance GNNs to capture LRI by approximating the potential LRI of the molecule. We conduct extensive experiments on four long-range graph benchmarks, covering graph-level and link-level tasks on molecular graphs. We achieve up to a 27.32% and 38.27% improvement in the 2D and 3D scenarios, respectively. Empirically, our method can be equipped with an arbitrary GNN to help capture LRI. Code and datasets are publicly available at https://github.com/tmlr-group/NeuralAtom.
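The abstract's three-step mechanism (aggregate nodes into a few neural atoms, exchange information among them, project back to the nodes) can be sketched in NumPy. This is a minimal illustration of the communication-channel idea only: the attention-style weights, the number of neural atoms `K`, and the residual update are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def neural_atom_layer(h, Wq, Wk, Wv, q_atoms):
    """One neural-atom round over node features h of shape (N, d).

    q_atoms (K, d) plays the role of K learnable neural-atom queries;
    Wq, Wk, Wv are (d, d) projection matrices (all hypothetical names).
    """
    d = h.shape[1]
    # 1) Aggregate: each of the K neural atoms attends over all N nodes
    attn = softmax(q_atoms @ (h @ Wk).T / np.sqrt(d))        # (K, N)
    atoms = attn @ (h @ Wv)                                  # (K, d)
    # 2) Exchange: all-pairs attention among the K neural atoms,
    #    so any node pair communicates within a single hop
    mix = softmax((atoms @ Wq) @ atoms.T / np.sqrt(d))       # (K, K)
    atoms = mix @ atoms                                      # (K, d)
    # 3) Project back: each node receives a weighted mix of the
    #    neural atoms as a residual enhancement
    return h + attn.T @ atoms                                # (N, d)
```

Because the exchange step operates on only K neural atoms rather than N nodes, the cost of long-range communication is decoupled from the graph diameter; in practice this layer would be interleaved with an arbitrary GNN's message-passing layers.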

Authors (6)
  1. Xuan Li (129 papers)
  2. Zhanke Zhou (21 papers)
  3. Jiangchao Yao (74 papers)
  4. Yu Rong (146 papers)
  5. Lu Zhang (373 papers)
  6. Bo Han (282 papers)
Citations (1)
