Evolving Computation Graphs (2306.12943v1)

Published 22 Jun 2023 in cs.LG, cs.AI, cs.SI, and stat.ML

Abstract: Graph neural networks (GNNs) have demonstrated success in modeling relational data, especially for data that exhibits homophily: when a connection between nodes tends to imply that they belong to the same class. However, while this assumption holds in many relevant situations, there are important real-world scenarios that violate it, and this has spurred research into improving GNNs for these cases. In this work, we propose Evolving Computation Graphs (ECGs), a novel method for enhancing GNNs on heterophilic datasets. Our approach builds on prior theoretical insights linking node degree, high homophily, and inter- vs. intra-class embedding similarity by rewiring the GNNs' computation graph towards adding edges that connect nodes that are likely to be in the same class. We utilise weaker classifiers to identify these edges, ultimately improving GNN performance on non-homophilic data. We evaluate ECGs on a diverse set of recently proposed heterophilous datasets and demonstrate improvements over the relevant baselines. ECG presents a simple, intuitive and elegant approach for improving GNN performance on heterophilic datasets without requiring prior domain knowledge.
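
The abstract describes the mechanism only at a high level: a weak, feature-based classifier estimates how likely two nodes are to share a class, and the GNN's computation graph is rewired by adding edges between such pairs. The sketch below is a minimal illustration of that idea under stated assumptions, not the paper's implementation: it assumes a dense adjacency matrix, uses an MLP as a hypothetical weak classifier, and adds edges to each node's top-k most similar peers (by cosine similarity of predicted class distributions); all names and parameters are illustrative.

```python
import torch
import torch.nn.functional as F


def weak_classifier_probs(x, y, train_mask, num_classes, epochs=100):
    """Train a feature-only MLP (a stand-in 'weak classifier') and return softmax outputs."""
    mlp = torch.nn.Sequential(
        torch.nn.Linear(x.size(1), 64),
        torch.nn.ReLU(),
        torch.nn.Linear(64, num_classes),
    )
    opt = torch.optim.Adam(mlp.parameters(), lr=1e-2)
    for _ in range(epochs):
        opt.zero_grad()
        F.cross_entropy(mlp(x[train_mask]), y[train_mask]).backward()
        opt.step()
    with torch.no_grad():
        return F.softmax(mlp(x), dim=-1)


def rewire(adj, probs, k=5):
    """Add edges between nodes whose weak-classifier outputs are most similar."""
    z = F.normalize(probs, dim=-1)
    sim = z @ z.t()                                # cosine similarity of class distributions
    sim.fill_diagonal_(-1.0)                       # exclude self-loops from the top-k
    topk = sim.topk(k, dim=-1).indices             # k most similar nodes per node
    rows = torch.arange(adj.size(0)).unsqueeze(-1).expand_as(topk)
    new_adj = adj.clone()
    new_adj[rows, topk] = 1.0
    new_adj[topk, rows] = 1.0                      # keep the graph undirected
    return new_adj


def gcn_layer(adj, h, weight):
    """One mean-aggregation message-passing step over a dense adjacency matrix."""
    adj_hat = adj + torch.eye(adj.size(0))
    deg_inv = adj_hat.sum(dim=-1, keepdim=True).clamp(min=1.0).reciprocal()
    return F.relu(deg_inv * (adj_hat @ h) @ weight)
```

In practice, the rewired edges would feed the message-passing step of whichever GNN is being enhanced (the dense gcn_layer above is only a placeholder), and how the original and added edges are combined is a design choice the paper itself addresses.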

Authors (2)
  1. Andreea Deac (15 papers)
  2. Jian Tang (327 papers)
