Rethinking Spectral Graph Neural Networks with Spatially Adaptive Filtering (2401.09071v5)

Published 17 Jan 2024 in cs.LG and cs.AI

Abstract: Whilst spectral Graph Neural Networks (GNNs) are theoretically well-founded in the spectral domain, their practical reliance on polynomial approximation implies a profound linkage to the spatial domain. As previous studies rarely examine spectral GNNs from the spatial perspective, their spatial-domain interpretability remains elusive, e.g., what information is essentially encoded by spectral GNNs in the spatial domain? In this paper, to answer this question, we establish a theoretical connection between spectral filtering and spatial aggregation, unveiling an intrinsic interaction that spectral filtering implicitly leads the original graph to an adapted new graph, explicitly computed for spatial aggregation. Both theoretical and empirical investigations reveal that the adapted new graph not only exhibits non-locality but also accommodates signed edge weights to reflect label consistency among nodes. These findings thus highlight the interpretable role of spectral GNNs in the spatial domain and inspire us to rethink graph spectral filters beyond the fixed-order polynomials, which neglect global information. Built upon the theoretical findings, we revisit the state-of-the-art spectral GNNs and propose a novel Spatially Adaptive Filtering (SAF) framework, which leverages the adapted new graph by spectral filtering for an auxiliary non-local aggregation. Notably, our proposed SAF comprehensively models both node similarity and dissimilarity from a global perspective, therefore alleviating persistent deficiencies of GNNs related to long-range dependencies and graph heterophily. Extensive experiments over 13 node classification benchmarks demonstrate the superiority of our proposed framework to the state-of-the-art models.
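
The abstract's central claim, that applying a polynomial spectral filter g(L) = sum_k theta_k L^k to node features is the same operation as spatial aggregation over an adapted graph whose weighted adjacency is g(L) itself, can be checked numerically. Below is a minimal sketch of that equivalence, not the authors' SAF implementation: the toy path graph and the coefficients theta are illustrative assumptions chosen only so the effect is visible. The sketch verifies that filtering in the spectral domain (via the eigendecomposition of the Laplacian) matches aggregation over g(L), and that this adapted graph is non-local (it connects nodes beyond the original edges) and carries signed weights, as the paper argues.

    import numpy as np

    # Toy undirected graph: a 5-node path (illustrative assumption).
    n = 5
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1.0

    # Symmetrically normalized Laplacian L = I - D^{-1/2} A D^{-1/2}.
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    L = np.eye(n) - d_inv_sqrt @ A @ d_inv_sqrt

    # Fixed-order polynomial spectral filter g(L) = sum_k theta_k L^k.
    # Coefficients are arbitrary, for demonstration only.
    theta = [0.2, 1.0, 0.5]
    g_L = sum(t * np.linalg.matrix_power(L, k) for k, t in enumerate(theta))

    # Spectral-domain filtering via L = U diag(lam) U^T gives the same
    # result as one aggregation step over the adapted graph g(L).
    X = np.random.default_rng(0).normal(size=(n, 3))  # random node features
    lam, U = np.linalg.eigh(L)
    g_lam = sum(t * lam**k for k, t in enumerate(theta))
    assert np.allclose(g_L @ X, U @ np.diag(g_lam) @ U.T @ X)

    # The adapted graph is non-local and signed.
    print("edges beyond A + I:", np.count_nonzero(g_L) > np.count_nonzero(A + np.eye(n)))
    print("signed weights:", bool((g_L < 0).any() and (g_L > 0).any()))

SAF, as described in the abstract, builds on this observation by explicitly computing such an adapted graph and using it for an auxiliary non-local aggregation alongside the usual spectral filtering.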

Authors (5)
  1. Jingwei Guo (23 papers)
  2. Kaizhu Huang (95 papers)
  3. Xinping Yi (63 papers)
  4. Zixian Su (8 papers)
  5. Rui Zhang (1140 papers)
Citations (2)
