
Advancing Graph Neural Networks with HL-HGAT: A Hodge-Laplacian and Attention Mechanism Approach for Heterogeneous Graph-Structured Data (2403.06687v2)

Published 11 Mar 2024 in cs.LG and cs.CV

Abstract: Graph neural networks (GNNs) have proven effective in capturing relationships among nodes in a graph. This study introduces a novel perspective by considering a graph as a simplicial complex, encompassing nodes, edges, triangles, and $k$-simplices, enabling the definition of graph-structured data on any $k$-simplices. Our contribution is the Hodge-Laplacian heterogeneous graph attention network (HL-HGAT), designed to learn heterogeneous signal representations across $k$-simplices. The HL-HGAT incorporates three key components: HL convolutional filters (HL-filters), simplicial projection (SP), and simplicial attention pooling (SAP) operators, applied to $k$-simplices. HL-filters leverage the unique topology of $k$-simplices encoded by the Hodge-Laplacian (HL) operator, operating within the spectral domain of the $k$-th HL operator. To address computation challenges, we introduce a polynomial approximation for HL-filters, exhibiting spatial localization properties. Additionally, we propose a pooling operator to coarsen $k$-simplices, combining features through simplicial attention mechanisms of self-attention and cross-attention via transformers and SP operators, capturing topological interconnections across multiple dimensions of simplices. The HL-HGAT is comprehensively evaluated across diverse graph applications, including NP-hard problems, graph multi-label and classification challenges, and graph regression tasks in logistics, computer vision, biology, chemistry, and neuroscience. The results demonstrate the model's efficacy and versatility in handling a wide range of graph-based scenarios.
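The two core ingredients the abstract describes, the $k$-th Hodge-Laplacian built from boundary matrices and a polynomial approximation of the HL spectral filter that is spatially localized, can be sketched as follows. This is a minimal illustration on a hypothetical toy complex (a single filled triangle); the boundary matrices, signal, and coefficients are invented for the example and are not taken from the paper.

```python
import numpy as np

# Toy simplicial complex: nodes {0,1,2}, edges {(0,1),(0,2),(1,2)},
# and one filled triangle (a 2-simplex). Orientations are illustrative.
# B1: node-to-edge boundary matrix (rows: nodes, cols: edges).
B1 = np.array([[-1, -1,  0],
               [ 1,  0, -1],
               [ 0,  1,  1]], dtype=float)
# B2: edge-to-triangle boundary matrix (rows: edges, cols: triangles).
B2 = np.array([[1], [-1], [1]], dtype=float)

# k-th Hodge-Laplacian: L_k = B_k^T B_k + B_{k+1} B_{k+1}^T.
# Here k = 1, so L1 operates on edge signals.
L1 = B1.T @ B1 + B2 @ B2.T

def hl_poly_filter(L, x, theta):
    """Apply h(L) x = sum_p theta[p] * L^p x.

    Evaluating the polynomial directly in the spatial domain avoids an
    eigendecomposition of L, and a degree-P polynomial only mixes signal
    values within P hops, giving the spatial localization the abstract
    attributes to the approximated HL-filters.
    """
    out = np.zeros_like(x)
    Lp = np.eye(L.shape[0])  # L^0
    for t in theta:
        out += t * (Lp @ x)
        Lp = Lp @ L
    return out

x_edge = np.array([1.0, 0.0, -1.0])  # an example signal on the three edges
y = hl_poly_filter(L1, x_edge, theta=[0.5, 0.25, 0.1])
```

For this particular complex the triangle term cancels the off-diagonal entries, so `L1` reduces to `3 * I` and the filter simply rescales the edge signal; on larger complexes the same construction mixes features along shared nodes and shared triangles, which is what the HL-filters exploit.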

