Conditional Local Feature Encoding for Graph Neural Networks (2405.04755v1)
Abstract: Graph neural networks (GNNs) have shown great success in learning from graph-based data. The key mechanism of current GNNs is message passing, where a node's feature is updated based on the information passed from its local neighbourhood. A limitation of this mechanism is that node features become increasingly dominated by the information aggregated from the neighbourhood as more rounds of message passing are applied. Consequently, as GNN layers become deeper, adjacent node features tend to be similar, making it more difficult for GNNs to distinguish adjacent nodes and thereby limiting their performance. In this paper, we propose conditional local feature encoding (CLFE) to help prevent node features from being dominated by information from the local neighbourhood. Our method extracts the node hidden state embedding from the message passing process and concatenates it with the node features from the previous stage; we then apply a linear transformation to the concatenated vector to form the CLFE. The CLFE forms the layer output to better preserve node-specific information, thus helping to improve the performance of GNN models. To verify the feasibility of our method, we conducted extensive experiments on seven benchmark datasets across four graph-domain tasks: super-pixel graph classification, node classification, link prediction, and graph regression. The experimental results consistently demonstrate that our method improves model performance across a variety of baseline GNN models for all four tasks.
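The sketch below illustrates the mechanism the abstract describes: a message-passing step produces a hidden state, which is concatenated with the layer's input features and passed through a linear transformation to form the layer output. This is a minimal PyTorch sketch, not the paper's exact architecture; the `CLFELayer` name, the mean-aggregation message passing, and all dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn


class CLFELayer(nn.Module):
    """One GNN layer with a conditional-local-feature-encoding-style output.

    Assumption: message passing is simple mean aggregation over neighbours;
    the paper's baselines (GCN, GAT, etc.) could be substituted here.
    """

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        # Weights for the message-passing hidden state.
        self.msg = nn.Linear(in_dim, out_dim)
        # Linear transformation over [h_mp || h_prev] forming the output.
        self.clfe = nn.Linear(out_dim + in_dim, out_dim)

    def forward(self, h_prev: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj: dense (N, N) adjacency; row-normalise for mean aggregation.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        h_mp = torch.relu(self.msg((adj / deg) @ h_prev))  # message passing
        # Concatenate the hidden state with the previous-stage node features,
        # then apply the linear transformation; node-specific information in
        # h_prev is carried directly into the layer output.
        return self.clfe(torch.cat([h_mp, h_prev], dim=-1))


# Usage: 5 nodes, 8 input features, 16-dimensional output.
if __name__ == "__main__":
    x = torch.randn(5, 8)
    adj = (torch.rand(5, 5) > 0.5).float()
    layer = CLFELayer(8, 16)
    print(layer(x, adj).shape)  # torch.Size([5, 16])
```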
Authors: Yongze Wang, Haimin Zhang, Qiang Wu, Min Xu