
Graph Neural Networks with Diverse Spectral Filtering (2312.09041v3)

Published 14 Dec 2023 in cs.LG and cs.SI

Abstract: Spectral Graph Neural Networks (GNNs) have achieved tremendous success in graph machine learning, with polynomial filters applied for graph convolutions, where all nodes share identical filter weights to mine their local contexts. Despite this success, existing spectral GNNs usually fail to deal with complex networks (e.g., the WWW) because such a homogeneous spectral filtering setting ignores the regional heterogeneity typically seen in real-world networks. To tackle this issue, we propose a novel diverse spectral filtering (DSF) framework, which automatically learns node-specific filter weights to properly exploit the varying local structure. In particular, the diverse filter weights consist of two components -- a global one shared among all nodes, and a local one that varies along network edges to reflect node differences arising from distinct graph parts -- to balance local and global information. As such, not only can the global graph characteristics be captured, but the diverse local patterns can also be mined with awareness of different node positions. Interestingly, we formulate a novel optimization problem to assist in learning diverse filters, which also enables us to enhance any spectral GNN with our DSF framework. We showcase the proposed framework on three state-of-the-art models, including GPR-GNN, BernNet, and JacobiConv. Extensive experiments over 10 benchmark datasets demonstrate that our framework consistently boosts model performance by up to 4.92% on node classification tasks, producing diverse filters with enhanced interpretability. Code is available at \url{https://github.com/jingweio/DSF}.


Summary

  • The paper introduces a Diverse Spectral Filtering framework that learns node-specific filter weights to capture both global and local graph features.
  • It formulates an optimization problem that embeds graph vertices into a low-dimensional positional space, from which node-specific coefficients adapt homogeneous spectral filters to diverse local contexts.
  • Experiments show the DSF framework significantly improves node classification accuracy and offers deeper interpretability on complex, non-uniform graphs.

Understanding Graph Neural Networks with Diverse Spectral Filtering

Introduction to Spectral GNNs

Graph Neural Networks (GNNs) have become a critical tool for learning from graph-structured data across fields ranging from social networks to recommendation systems. Among them, spectral GNNs play a pivotal role: they filter node features in the spectrum of the graph Laplacian, most commonly by applying a polynomial of the Laplacian whose coefficients are shared by every node. Despite their success, traditional spectral GNNs struggle with complex networks because this one-size-fits-all filtering overlooks the regional heterogeneity of local structures in the network.
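To make the shared-weight setting concrete, here is a minimal NumPy sketch of homogeneous polynomial filtering. The monomial basis L^k is an assumption chosen for simplicity; models such as GPR-GNN, BernNet, and JacobiConv use PageRank, Bernstein, and Jacobi polynomial bases instead.

```python
import numpy as np

def normalized_laplacian(adj):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg, dtype=float)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    return np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

def homogeneous_polynomial_filter(L, x, weights):
    """y = sum_k w_k L^k x: every node applies the same scalar
    weights w_k, i.e. the one-size-fits-all filtering DSF relaxes."""
    out = np.zeros_like(x, dtype=float)
    basis = x.astype(float)                 # L^0 x
    for w in weights:
        out += w * basis
        basis = L @ basis                   # advance to the next power of L
    return out
```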

Diverse Spectral Filtering Framework

To address these limitations, a novel framework, Diverse Spectral Filtering (DSF), is introduced. Its core idea is to learn node-specific filter weights that can accommodate the diversity of local structures across different regions of the graph. The DSF framework achieves this by combining a globally shared filter with locally varying filter weights that reflect node differences driven by distinct parts of the graph. This balancing act enables the DSF framework to capture both global characteristics of the graph and the nuanced variations across different nodes, leading to enhanced interpretability and consistently improved performance in node classification tasks.
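The decomposition can be sketched as follows. Assuming each node already carries a low-dimensional positional embedding (how such embeddings might be obtained is shown in the next section), its polynomial weights are the shared global weights plus a locally varying offset. The tanh squashing and the linear projection `proj` are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def diverse_filter_weights(global_w, pos_emb, proj):
    """Node-specific weights = shared global weights + a local offset
    computed from each node's positional embedding. The tanh and the
    linear map `proj` are illustrative choices, not the paper's exact
    parameterization."""
    local_w = np.tanh(pos_emb @ proj)       # (n, K) per-node offsets
    return global_w[None, :] + local_w      # (n, K) diverse weights

def diverse_polynomial_filter(L, x, node_weights):
    """y_i = sum_k w_{i,k} (L^k x)_i: each node filters with its own
    polynomial coefficients instead of a single shared set."""
    out = np.zeros_like(x, dtype=float)
    basis = x.astype(float)
    for k in range(node_weights.shape[1]):
        out += node_weights[:, k:k + 1] * basis
        basis = L @ basis
    return out
```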

Optimizing Spectral GNNs

To learn diverse filters, a new optimization problem is designed. It starts by embedding graph vertices into a low-dimensional space that encodes the positional information of nodes. Node-specific coefficients are then derived from these embeddings to adapt the original homogeneous spectral filter to each node's local context. This refinement yields a graph filter that is sensitive to individual nodes' unique positions, facilitating a more nuanced treatment of their diverse local contexts.
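For intuition, a classical way to embed vertices into a low-dimensional positional space is via the low-frequency eigenvectors of the graph Laplacian (Laplacian eigenmaps). The paper instead learns its positional embeddings through its own optimization problem, so the spectral shortcut below is only an illustrative stand-in.

```python
import numpy as np

def positional_embedding(L, dim=8):
    """Embed vertices using the eigenvectors of the normalized Laplacian
    with the smallest nonzero eigenvalues (Laplacian-eigenmap style).
    Assumes a connected graph, i.e. a single zero eigenvalue; DSF itself
    learns these embeddings rather than computing them this way."""
    _, vecs = np.linalg.eigh(L)             # eigenvalues in ascending order
    return vecs[:, 1:dim + 1]               # drop the trivial eigenvector
```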

Enhancement of Existing Models and Interpretability

Plugging the DSF framework into existing spectral GNNs such as GPR-GNN, BernNet, and JacobiConv consistently boosts node classification accuracy, by up to 4.92% across the 10 benchmark datasets studied, with the largest gains on complex, non-uniform graphs. Beyond accuracy, the learned filters are more interpretable: they differentiate the global structure of the graph from the local peculiarities of nodes positioned in diverse graph regions.
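Putting the pieces together, here is a hypothetical end-to-end run that reuses the helper functions sketched above. The 4-node path graph, polynomial order K = 5, and 2-dimensional positional space are all arbitrary illustrative choices.

```python
import numpy as np

# Toy DSF-style filtering pass, reusing normalized_laplacian,
# positional_embedding, diverse_filter_weights, and
# diverse_polynomial_filter from the earlier sketches.
adj = np.array([[0., 1., 0., 0.],
                [1., 0., 1., 0.],
                [0., 1., 0., 1.],
                [0., 0., 1., 0.]])
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))                      # node features

L = normalized_laplacian(adj)
pos = positional_embedding(L, dim=2)             # per-node positions
global_w = rng.normal(size=5)                    # shared filter weights
proj = rng.normal(size=(2, 5))                   # positions -> weight offsets

w = diverse_filter_weights(global_w, pos, proj)  # (4, 5) node-specific weights
y = diverse_polynomial_filter(L, x, w)           # DSF-style filtered features
```

In a trained model, `global_w` and `proj` would be learned parameters rather than random draws; the point of the sketch is only the data flow from positions to node-specific weights to filtered features.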

Conclusion

The DSF framework represents a meaningful advance for spectral GNNs. By moving beyond homogeneous filtering and embracing the diversity of local graph structures, DSF delivers stronger performance and richer interpretive insights. The code is publicly available for the community to experiment with and integrate into their own spectral GNN models.
