Rayleigh Quotient Graph Neural Networks for Graph-level Anomaly Detection (2310.02861v4)
Abstract: Graph-level anomaly detection has gained significant attention as it finds applications in various domains, such as cancer diagnosis and enzyme prediction. However, existing methods fail to capture the spectral properties of graph anomalies, resulting in unexplainable framework designs and unsatisfactory performance. In this paper, we re-investigate the spectral differences between anomalous and normal graphs. Our main observation reveals a significant disparity in the accumulated spectral energy between these two classes. Moreover, we prove that the accumulated spectral energy of a graph signal can be represented by its Rayleigh Quotient, indicating that the Rayleigh Quotient is a driving factor behind the anomalous properties of graphs. Motivated by this, we propose the Rayleigh Quotient Graph Neural Network (RQGNN), the first spectral GNN that explores the inherent spectral features of anomalous graphs for graph-level anomaly detection. Specifically, we introduce a novel framework with two components: the Rayleigh Quotient learning component (RQL) and the Chebyshev Wavelet GNN with RQ-pooling (CWGNN-RQ). RQL explicitly captures the Rayleigh Quotient of graphs, while CWGNN-RQ implicitly explores the spectral space of graphs. Extensive experiments on 10 real-world datasets show that RQGNN outperforms the best rival by 6.74% in Macro-F1 score and 1.44% in AUC, demonstrating the effectiveness of our framework. Our code is available at https://github.com/xydong127/RQGNN.
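The abstract's key identity, that the accumulated spectral energy of a graph signal is captured by its Rayleigh Quotient, can be checked numerically. The sketch below (a minimal illustration, not the paper's implementation; the graph, signal, and variable names are my own) computes the Rayleigh Quotient x^T L x / x^T x of a signal on a small graph's Laplacian, then recovers the same value from the spectral side as the energy-weighted average of the Laplacian's eigenvalues.

```python
import numpy as np

# Small undirected graph (a path on 4 nodes) given by its adjacency matrix.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))
L = D - A                            # combinatorial graph Laplacian

x = np.array([1.0, -2.0, 0.5, 3.0])  # an arbitrary graph signal

# Rayleigh Quotient computed directly from the Laplacian.
rq = (x @ L @ x) / (x @ x)

# The same quantity from the spectral view: eigendecompose L, take the
# graph Fourier transform of x, and form the energy-weighted eigenvalue mean.
lam, U = np.linalg.eigh(L)           # eigenvalues/eigenvectors of symmetric L
x_hat = U.T @ x                      # graph Fourier coefficients of x
energy = x_hat ** 2                  # spectral energy per frequency
rq_spectral = (lam * energy).sum() / energy.sum()

assert np.isclose(rq, rq_spectral)
```

The equality holds because x^T L x = Σᵢ λᵢ x̂ᵢ² and x^T x = Σᵢ x̂ᵢ², so the Rayleigh Quotient summarizes how the signal's energy is distributed across the graph's frequencies, which is the property the paper exploits to separate anomalous from normal graphs.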