
Representation Learning on Heterophilic Graph with Directional Neighborhood Attention (2403.01475v1)

Published 3 Mar 2024 in cs.LG, cs.AI, and cs.SI

Abstract: The Graph Attention Network (GAT) is one of the most popular Graph Neural Network (GNN) architectures; it employs an attention mechanism to learn edge weights and has demonstrated promising performance in various applications. However, since it only incorporates information from the immediate neighborhood, it lacks the ability to capture long-range and global graph information, leading to unsatisfactory performance on some datasets, particularly heterophilic graphs. To address this limitation, we propose the Directional Graph Attention Network (DGAT), which combines feature-based attention with global directional information extracted from the graph topology. To this end, a new class of Laplacian matrices is proposed that provably reduces the diffusion distance between nodes. Based on the new Laplacian, topology-guided neighbor-pruning and edge-adding mechanisms are proposed to remove noisy edges and capture helpful long-range neighborhood information. In addition, a global directional attention is designed to enable topology-aware information propagation. The superiority of the proposed DGAT over the baseline GAT is verified through experiments on real-world benchmarks and synthetic datasets. It also outperforms state-of-the-art (SOTA) models on 6 out of 7 real-world benchmark datasets.
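For context, the baseline that DGAT extends computes per-edge attention coefficients as in the original GAT: e_ij = LeakyReLU(a^T [Wh_i || Wh_j]), normalized by a softmax over each node's neighborhood. The sketch below implements only this standard GAT scoring step in plain numpy; the directional components specific to DGAT (the modified Laplacian, pruning, and edge adding) are not detailed in the abstract and are not implemented here.

```python
import numpy as np

def gat_attention(H, A, W, a, slope=0.2):
    """Single-head GAT attention coefficients (Velickovic et al., 2017).

    H: (n, f) node features; A: (n, n) binary adjacency with self-loops;
    W: (f, f2) weight matrix; a: (2*f2,) attention vector.
    Returns an (n, n) matrix whose row i is a softmax over i's neighbors.
    """
    Z = H @ W                       # transformed features W h_i
    n = Z.shape[0]
    e = np.full((n, n), -np.inf)    # -inf masks non-edges out of the softmax
    for i in range(n):
        for j in range(n):
            if A[i, j]:
                # e_ij = LeakyReLU(a^T [W h_i || W h_j])
                s = a @ np.concatenate([Z[i], Z[j]])
                e[i, j] = s if s > 0 else slope * s
    # numerically stable softmax over each row (neighborhood)
    e -= e.max(axis=1, keepdims=True)
    exp_e = np.exp(e)
    return exp_e / exp_e.sum(axis=1, keepdims=True)
```

Each row of the returned matrix sums to 1 over the node's neighbors and is exactly zero elsewhere, which is the edge-weight matrix a GAT layer would use for aggregation.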

Authors (4)
  1. Qincheng Lu (11 papers)
  2. Jiaqi Zhu (28 papers)
  3. Sitao Luan (25 papers)
  4. Xiao-Wen Chang (34 papers)
Citations (5)
