Hypergraph-MLP: Learning on Hypergraphs without Message Passing (2312.09778v3)
Abstract: Hypergraphs are vital for modelling data with higher-order relations involving more than two entities, and are gaining prominence in machine learning and signal processing. Many hypergraph neural networks leverage message passing over hypergraph structures to enhance node representation learning, yielding impressive performance on tasks like hypergraph node classification. However, these message-passing-based models face several challenges, including oversmoothing as well as high latency and sensitivity to structural perturbations at inference time. To tackle these challenges, we propose an alternative approach that integrates information about the hypergraph structure into the training supervision without explicit message passing, thereby also removing the reliance on it at inference. Specifically, we introduce Hypergraph-MLP, a novel learning framework for hypergraph-structured data in which the learning model is a straightforward multilayer perceptron (MLP) supervised by a loss function based on a notion of signal smoothness on hypergraphs. Experiments on hypergraph node classification tasks demonstrate that Hypergraph-MLP achieves competitive performance compared to existing baselines, while being considerably faster and more robust against structural perturbations at inference.
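To make the idea of "signal smoothness on hypergraphs" concrete, below is a minimal numpy sketch of one common smoothness measure: the sum of squared pairwise feature differences within each hyperedge, normalised by hyperedge size. This is an illustrative assumption for exposition, not the paper's exact loss; the function name `hypergraph_smoothness` and the toy data are hypothetical.

```python
import numpy as np

def hypergraph_smoothness(X, hyperedges):
    """Illustrative smoothness of node features X over a hypergraph.

    A common (assumed, not the paper's exact) choice: for each hyperedge,
    sum squared pairwise distances between its members' feature vectors,
    normalised by the hyperedge size. Lower values mean nodes sharing a
    hyperedge have more similar features.
    """
    total = 0.0
    for e in hyperedges:
        members = X[sorted(e)]                        # (|e|, d) node features
        diffs = members[:, None, :] - members[None, :, :]
        total += (diffs ** 2).sum() / (2 * len(e))    # /2: each pair counted twice
    return total

# Toy example: 4 nodes with 1-d features, grouped into two hyperedges.
X = np.array([[0.0], [0.1], [5.0], [5.1]])
smooth = hypergraph_smoothness(X, [{0, 1}, {2, 3}])   # similar nodes grouped
rough = hypergraph_smoothness(X, [{0, 2}, {1, 3}])    # dissimilar nodes grouped
```

A training loss of the kind the paper describes would then combine a standard classification term on the MLP's outputs with such a smoothness regulariser, so that structural information shapes training without any message passing at inference.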