Graph Neural Networks with Diverse Spectral Filtering (2312.09041v3)
Abstract: Spectral Graph Neural Networks (GNNs) have achieved tremendous success in graph machine learning by applying polynomial filters for graph convolutions, where all nodes share identical filter weights to mine their local contexts. Despite this success, existing spectral GNNs usually fail to deal with complex networks (e.g., the WWW) because this homogeneous spectral filtering setting ignores the regional heterogeneity typically seen in real-world networks. To tackle this issue, we propose a novel diverse spectral filtering (DSF) framework, which automatically learns node-specific filter weights to properly exploit the varying local structure. In particular, the diverse filter weights consist of two components: a global one shared among all nodes, and a local one that varies along network edges to reflect node differences arising from distinct graph parts. Together they balance local and global information, so that not only can the global graph characteristics be captured, but the diverse local patterns can also be mined with awareness of different node positions. Interestingly, we formulate a novel optimization problem to assist in learning diverse filters, which also enables us to enhance any spectral GNN with our DSF framework. We showcase the proposed framework on three state-of-the-art models, including GPR-GNN, BernNet, and JacobiConv. Extensive experiments over 10 benchmark datasets demonstrate that our framework consistently boosts model performance by up to 4.92% in node classification tasks, producing diverse filters with enhanced interpretability. Code is available at \url{https://github.com/jingweio/DSF}.
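The abstract's core idea, per-node polynomial filter weights formed as a shared global component plus a learned local offset, can be illustrated with a minimal PyTorch sketch. This is an assumption-laden illustration, not the authors' implementation (see the linked repository): the class name `DiverseSpectralFilter`, the MLP used to derive local offsets, and the use of positional encodings as its input are all hypothetical choices made here for clarity.

```python
# Minimal sketch of diverse spectral filtering: each node's polynomial filter
# coefficients are a globally shared vector plus a node-specific offset.
import torch
import torch.nn as nn

class DiverseSpectralFilter(nn.Module):
    def __init__(self, in_dim, hidden_dim, K=3):
        super().__init__()
        self.K = K
        # Global component: one coefficient per polynomial order, shared by all nodes.
        self.global_coeffs = nn.Parameter(torch.ones(K + 1))
        # Local component (hypothetical design): an MLP maps node positional
        # encodings to per-node coefficient offsets.
        self.local_mlp = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, K + 1), nn.Tanh(),
        )

    def forward(self, x, adj_norm, pos_enc):
        # x: [N, d] node features; adj_norm: [N, N] normalized adjacency;
        # pos_enc: [N, in_dim] positional encodings for deriving local weights.
        coeffs = self.global_coeffs + self.local_mlp(pos_enc)  # [N, K+1] per-node weights
        out = coeffs[:, 0:1] * x
        h = x
        for k in range(1, self.K + 1):
            h = adj_norm @ h                     # k-th order propagation
            out = out + coeffs[:, k:k + 1] * h   # node-wise weighting of each order
        return out

# Toy usage with a placeholder (identity) normalized adjacency.
N, d = 5, 8
x = torch.randn(N, d)
adj = torch.eye(N)
filt = DiverseSpectralFilter(in_dim=d, hidden_dim=16, K=3)
print(filt(x, adj, x).shape)  # torch.Size([5, 8])
```

With the local MLP's output zeroed out, the sketch reduces to a standard homogeneous polynomial filter, which is why the same recipe can in principle wrap any polynomial-based spectral GNN.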
- Optuna: A next-generation hyperparameter optimization framework. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. 2623–2631.
- Uri Alon and Eran Yahav. 2021. On the bottleneck of graph neural networks and its practical implications. In International Conference on Learning Representations. https://openreview.net/forum?id=i80OPhOCVH2
- Breaking the limits of message passing graph neural networks. In International Conference on Machine Learning. PMLR, 599–608.
- Mikhail Belkin and Partha Niyogi. 2003. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation 15, 6 (2003), 1373–1396.
- Graph neural networks with convolutional ARMA filters. IEEE Transactions on Pattern Analysis and Machine Intelligence (2021).
- Beyond low-frequency information in graph convolutional networks. In Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 35. 3950–3957.
- Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence (2022).
- Tianwen Chen and Raymond Chi-Wing Wong. 2020. Handling information loss of graph neural networks for session-based recommendation. In Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. 1172–1180.
- Adaptive universal generalized PageRank graph neural network. In International Conference on Learning Representations. https://openreview.net/forum?id=n6jl7fLxrP
- Fan R. K. Chung. 1996. Spectral graph theory.
- Convolutional neural networks on graphs with fast localized spectral filtering. Advances in Neural Information Processing Systems 29 (2016).
- AdaGNN: Graph neural networks with adaptive frequency response filter. In Proceedings of the 30th ACM International Conference on Information & Knowledge Management. 392–401.
- Position information in transformers: An overview. Computational Linguistics 48, 3 (2022), 733–763.
- Graph neural networks with learnable structural and positional representations. In International Conference on Learning Representations. https://openreview.net/forum?id=wTTjnvGphYj
- Predict then propagate: Graph neural networks meet personalized PageRank. In International Conference on Learning Representations (ICLR).
- Neural message passing for quantum chemistry. In International Conference on Machine Learning. PMLR, 1263–1272.
- Alex Graves. 2012. Long short-term memory. Supervised sequence labelling with recurrent neural networks (2012), 37–45.
- ES-GNN: Generalizing graph neural networks beyond homophily with edge splitting. arXiv preprint arXiv:2205.13700 (2022).
- Dynamic neural networks: A survey. IEEE Transactions on Pattern Analysis and Machine Intelligence (2021).
- BernNet: Learning arbitrary graph spectral filters via Bernstein approximation. Advances in Neural Information Processing Systems 34 (2021), 14239–14251.
- How much position information do convolutional neural networks encode? arXiv preprint arXiv:2001.08248 (2020).
- Anil K Jain and Richard C Dubes. 1988. Algorithms for clustering data. Prentice-Hall, Inc.
- Diederik P Kingma and Jimmy Ba. 2014. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980 (2014).
- Thomas N. Kipf and Max Welling. 2017. Semi-supervised classification with graph convolutional networks. In International Conference on Learning Representations (ICLR).
- Rongjie Lai and Stanley Osher. 2014. A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58, 2 (2014), 431–449.
- Deeper insights into graph convolutional networks for semi-supervised learning. In Thirty-Second AAAI Conference on Artificial Intelligence.
- New benchmarks for learning on non-homophilous graphs. arXiv preprint arXiv:2104.01404 (2021).
- A piece-wise polynomial filtering approach for graph neural networks. arXiv preprint arXiv:2112.03499 (2021).
- Meta-weight graph neural network: Push the limits beyond global homophily. In Proceedings of the ACM Web Conference 2022. 1270–1280.
- Image-based recommendations on styles and substitutes. In Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval. 43–52.
- Weisfeiler and Leman go neural: Higher-order graph neural networks. In Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 33. 4602–4609.
- The PageRank citation ranking: Bringing order to the web. Technical Report. Stanford InfoLab.
- Geom-GCN: Geometric graph convolutional networks. arXiv preprint arXiv:2002.05287 (2020).
- Multi-scale attributed node embedding. Journal of Complex Networks 9, 2 (2021), cnab014.
- Collective classification in network data. AI Magazine 29, 3 (2008), 93–93.
- Pitfalls of graph neural network evaluation. Relational Representation Learning Workshop, NeurIPS 2018 (2018).
- The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains. IEEE Signal Processing Magazine 30, 3 (2013), 83–98.
- Breaking the limit of graph neural networks by improving the assortativity of graphs with local mixing patterns. Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining (2021).
- Laurens Van der Maaten and Geoffrey Hinton. 2008. Visualizing data using t-SNE. Journal of Machine Learning Research 9, 11 (2008).
- Graph attention networks. In International Conference on Learning Representations (2018). https://openreview.net/forum?id=rJXMpikCZ
- Equivariant and stable positional encoding for more powerful graph neural networks. In International Conference on Learning Representations.
- Xiyuan Wang and Muhan Zhang. 2022. How powerful are spectral graph neural networks. In International Conference on Machine Learning. PMLR, 23341–23362.
- Boris Weisfeiler and Andrei Leman. 1968. The reduction of a graph to canonical form and the algebra which appears therein. NTI, Series 2, 9 (1968), 12–16.
- Simplifying graph convolutional networks. In International Conference on Machine Learning. PMLR, 6861–6871.
- A comprehensive survey on graph neural networks. IEEE Transactions on Neural Networks and Learning Systems 32, 1 (2021), 4–24. https://doi.org/10.1109/TNNLS.2020.2978386
- How powerful are graph neural networks? In International Conference on Learning Representations. https://openreview.net/forum?id=ryGs6iA5Km
- Diverse message passing for attribute with heterophily. Advances in Neural Information Processing Systems 34 (2021), 4751–4763.
- Graph neural networks beyond compromise between attribute and topology. In Proceedings of the ACM Web Conference 2022. 1127–1135.
- PA-GNN: Parameter-adaptive graph neural networks. ([n. d.]).
- Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33 (2020), 7793–7804.
- Interpreting and unifying graph neural networks with an optimization framework. In Proceedings of the Web Conference 2021. 1215–1226.