Automated Graph Machine Learning: Approaches, Libraries, Benchmarks and Directions (2201.01288v2)
Abstract: Graph machine learning has been extensively studied in both academia and industry. However, as the literature on graph learning booms with a vast number of emerging methods and techniques, it becomes increasingly difficult to manually design the optimal machine learning algorithm for different graph-related tasks. To tackle this challenge, automated graph machine learning, which aims to discover the best hyper-parameter and neural architecture configurations for different graph tasks and data without manual design, is attracting increasing attention from the research community. In this paper, we extensively discuss automated graph machine learning approaches, covering hyper-parameter optimization (HPO) and neural architecture search (NAS) for graph machine learning. We briefly overview existing libraries designed for either graph machine learning or automated machine learning, and then introduce in depth AutoGL, our dedicated and the world's first open-source library for automated graph machine learning. We also describe a tailored benchmark that supports unified, reproducible, and efficient evaluations. Finally, we share our insights on future research directions for automated graph machine learning. This paper is the first systematic and comprehensive discussion of approaches, libraries, and directions for automated graph machine learning.
- P. Cui, X. Wang, J. Pei, and W. Zhu, “A survey on network embedding,” IEEE transactions on knowledge and data engineering, vol. 31, no. 5, pp. 833–852, 2018.
- W. L. Hamilton, R. Ying, and J. Leskovec, “Representation learning on graphs: Methods and applications,” IEEE DEBU, 2017.
- P. Goyal and E. Ferrara, “Graph embedding techniques, applications, and performance: A survey,” KBS, 2018.
- H. Cai, V. W. Zheng, and K. C.-C. Chang, “A comprehensive survey of graph embedding: Problems, techniques, and applications,” TKDE, 2018.
- Z. Zhang, P. Cui, and W. Zhu, “Deep learning on graphs: A survey,” TKDE, 2020.
- Z. Wu et al., “A comprehensive survey on graph neural networks,” TNNLS, 2020.
- J. Zhou et al., “Graph neural networks: A review of methods and applications,” arXiv:1812.08434, 2018.
- R. Ying et al., “Graph convolutional neural networks for web-scale recommender systems,” in KDD, 2018.
- J. Ma et al., “Learning disentangled representations for recommendation,” in NeurIPS, 2019.
- H. Li, X. Wang, Z. Zhang, J. Ma, P. Cui, and W. Zhu, “Intention-aware sequential recommendation with structured intent transition,” IEEE Transactions on Knowledge and Data Engineering, vol. 34, no. 11, pp. 5403–5414, 2021.
- S. Liu, Z. Meng, C. Macdonald, and I. Ounis, “Graph neural pre-training for recommendation with side information,” ACM Transactions on Information Systems, vol. 41, no. 3, pp. 1–28, 2023.
- H. Cui, J. Lu, Y. Ge, and C. Yang, “How can graph neural networks help document retrieval: A case study on cord19 with concept map generation,” in European Conference on Information Retrieval. Springer, 2022, pp. 75–83.
- C. Feng, Y. He, S. Wen, G. Liu, L. Wang, J. Xu, and B. Zheng, “Dc-gnn: Decoupled graph neural networks for improving and accelerating large-scale e-commerce retrieval,” in Companion Proceedings of the Web Conference 2022, 2022, pp. 32–40.
- L. Wang, X. Li, H. Zhang, Y. Dai, and S. Zhang, “Gnn-based retrieval and recommadation system: A semantic enhenced graph model,” in 2022 IEEE 5th Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC), vol. 5. IEEE, 2022, pp. 1823–1830.
- K. Liang, J. Tan, D. Zeng, Y. Huang, X. Huang, and G. Tan, “Abslearn: a gnn-based framework for aliasing and buffer-size information retrieval,” Pattern Analysis and Applications, pp. 1–19, 2023.
- L. Akoglu, H. Tong, and D. Koutra, “Graph based anomaly detection and description: a survey,” DMKD, 2015.
- C. Su et al., “Network embedding in biomedical data science,” Briefings in bioinformatics, 2020.
- M. Zitnik and J. Leskovec, “Predicting multicellular function through multi-layer tissue networks,” Bioinformatics, 2017.
- T. Kipf et al., “Neural relational inference for interacting systems,” ICML, 2018.
- Y. Li et al., “Diffusion convolutional recurrent neural network: Data-driven traffic forecasting,” in ICLR, 2018.
- B. Yu, H. Yin, and Z. Zhu, “Spatio-temporal graph convolutional networks: a deep learning framework for traffic forecasting,” in IJCAI, 2018.
- Q. Wang et al., “Knowledge graph embedding: A survey of approaches and applications,” TKDE, 2017.
- V. N. Ioannidis, D. Zheng, and G. Karypis, “Few-shot link prediction via graph neural networks for covid-19 drug-repurposing,” arXiv:2007.10261, 2020.
- D. M. Gysi et al., “Network medicine framework for identifying drug repurposing opportunities for covid-19,” arXiv:2004.07229, 2020.
- A. Kapoor et al., “Examining covid-19 forecasting using spatio-temporal graph neural networks,” arXiv:2007.03113, 2020.
- X. He, K. Zhao, and X. Chu, “Automl: A survey of the state-of-the-art,” KBS, 2020.
- Q. Yao, M. Wang, Y. Chen, W. Dai, Y.-F. Li, W.-W. Tu, Q. Yang, and Y. Yu, “Taking human out of learning applications: A survey on automated machine learning,” arXiv:1810.13306, 2018.
- J. Bergstra and Y. Bengio, “Random search for hyper-parameter optimization,” JMLR, 2012.
- J. Bergstra et al., “Algorithms for hyper-parameter optimization,” NeurIPS, 2011.
- J. Snoek, H. Larochelle, and R. P. Adams, “Practical bayesian optimization of machine learning algorithms,” NeurIPS, 2012.
- Y. Liu, X. Wang, X. Xu, J. Yang, and W. Zhu, “Meta hyperparameter optimization with adversarial proxy subsets sampling,” in Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 2021, pp. 1109–1118.
- T. Elsken, J. H. Metzen, and F. Hutter, “Neural architecture search: A survey,” JMLR, 2019.
- Z. Wei, X. Wang, and W. Zhu, “Autoias: Automatic integrated architecture searcher for click-trough rate prediction,” in Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 2021, pp. 2101–2110.
- B. Zoph and Q. V. Le, “Neural architecture search with reinforcement learning,” in ICLR, 2017.
- H. Liu, K. Simonyan, and Y. Yang, “Darts: Differentiable architecture search,” in ICLR, 2018.
- H. Pham et al., “Efficient neural architecture search via parameters sharing,” in ICML, 2018.
- B. Zoph et al., “Learning transferable architectures for scalable image recognition,” in CVPR, 2018.
- E. Real et al., “Regularized evolution for image classifier architecture search,” in AAAI, 2019.
- J. Gilmer et al., “Neural message passing for quantum chemistry,” in ICML, 2017.
- M. M. Bronstein et al., “Geometric deep learning: going beyond euclidean data,” IEEE Signal Processing Magazine, 2017.
- Y. Gao et al., “Graph neural architecture search,” in IJCAI, 2020.
- W. Hu et al., “Open graph benchmark: Datasets for machine learning on graphs,” NeurIPS, 2020.
- C. Zang et al., “On power law growth of social networks,” TKDE, 2018.
- K. Tu et al., “Autone: Hyperparameter optimization for massive network embedding,” in KDD, 2019.
- X. Wang et al., “Explainable automated graph representation learning with hyperparameter importance,” in ICML, 2021.
- Z. Zhang et al., “Arbitrary-order proximity preserved network embedding,” in KDD, 2018.
- B. Perozzi, R. Al-Rfou, and S. Skiena, “Deepwalk: Online learning of social representations,” in KDD, 2014.
- T. N. Kipf and M. Welling, “Semi-supervised classification with graph convolutional networks,” in ICLR, 2017.
- M. Guo et al., “Jitune: Just-in-time hyperparameter tuning for network embedding algorithms,” arXiv:2101.06427, 2021.
- Y. Yuan et al., “A novel genetic algorithm with hierarchical evaluation strategy for hyperparameter optimisation of graph neural networks,” arXiv:2101.09300, 2021.
- M. Yoon et al., “Autonomous graph mining algorithm search with best speed/accuracy trade-off,” in IEEE ICDM, 2020.
- Y. Yuan, W. Wang, and W. Pang, “Which hyperparameters to optimise? an investigation of evolutionary hyperparameter optimisation in graph neural network for molecular property prediction,” in GECCO, 2021.
- ——, “A systematic comparison study on hyperparameter optimisation of graph neural networks for molecular property prediction,” in GECCO, 2021.
- R. Zhu, Z. Tao, Y. Li, and S. Li, “Automated graph learning via population based self-tuning gcn,” in Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2021, pp. 2096–2100.
- C. Bu, Y. Lu, and F. Liu, “Automatic graph learning with evolutionary algorithms: An experimental study,” in PRICAI, 2021.
- J. Sun, B. Wang, and B. Wu, “Automated graph representation learning for node classification,” in IJCNN, 2021.
- X. Yang, J. Wang, X. Zhao, S. Li, and Z. Tao, “Calibrate automated graph neural network via hyperparameter uncertainty,” in Proceedings of the 31st ACM International Conference on Information & Knowledge Management, 2022, pp. 4640–4644.
- O. Lloyd, Y. Liu, and T. R. Gaunt, “Assessing the effects of hyperparameters on knowledge graph embedding quality,” Journal of Big Data, vol. 10, no. 1, pp. 1–15, 2023.
- M. Yoon, T. Gervet, B. Hooi, and C. Faloutsos, “Autonomous graph mining algorithm search with best performance trade-off,” Knowledge and Information Systems, vol. 64, no. 6, pp. 1571–1602, 2022.
- Y. Zhang, Z. Zhou, Q. Yao, and Y. Li, “Kgtuner: Efficient hyper-parameter search for knowledge graph learning,” arXiv preprint arXiv:2205.02460, 2022.
- D. Yang, B. Qu, R. Hussein, P. Rosso, P. Cudré-Mauroux, and J. Liu, “Revisiting embedding based graph analyses: Hyperparameters matter!” IEEE Transactions on Knowledge and Data Engineering, 2022.
- K. Zhou et al., “Auto-gnn: Neural architecture search of graph neural networks,” arXiv:1909.03184, 2019.
- H. Zhao, L. Wei, and Q. Yao, “Simplifying architecture search for graph neural network,” arXiv:2008.11652, 2020.
- Y. Zhao et al., “Probabilistic dual network architecture search on graphs,” arXiv:2003.09676, 2020.
- M. Nunes et al., “Neural architecture search in graph neural networks,” in Brazilian Conference on Intelligent Systems, 2020.
- Y. Li and I. King, “Autograph: Automated graph neural network,” in ICONIP, 2020.
- M. Shi et al., “Evolutionary architecture search for graph neural networks,” arXiv:2009.10199, 2020.
- H. Zhao et al., “Efficient graph neural architecture search,” 2021. [Online]. Available: https://openreview.net/forum?id=IjIzIOkK2D6
- S. Jiang and P. Balaprakash, “Graph neural network architecture search for molecular property prediction,” in IEEE Big Data, 2020.
- Y. Zhao et al., “Learned low precision graph neural networks,” in EuroMLSys, 2021.
- J. You, Z. Ying, and J. Leskovec, “Design space for graph neural networks,” NeurIPS, 2020.
- G. Li et al., “Sgas: Sequential greedy architecture search,” in CVPR, 2020.
- W. Peng et al., “Learning graph convolutional network for skeleton-based human action recognition by neural searching,” AAAI, 2020.
- A. Pourchot and O. Sigaud, “CEM-RL: Combining evolutionary and gradient-based methods for policy search,” in ICLR, 2019.
- S. Cai et al., “Rethinking graph neural architecture search from message-passing,” CVPR, 2021.
- Z. Pan et al., “Autostg: Neural architecture search for predictions of spatio-temporal graphs,” WWW, 2021.
- Y. Li et al., “One-shot graph neural architecture search with dynamic search space,” AAAI, 2021.
- H. Zhao, Q. Yao, and W. Tu, “Search to aggregate neighborhood for graph neural network,” ICDE, 2021.
- C. Guan, X. Wang, and W. Zhu, “Autoattend: Automated attention representation search,” in ICML, 2021.
- Y. Ding et al., “Diffmg: Differentiable meta graph search for heterogeneous graph neural networks,” in KDD, 2021.
- G. Feng, C. Wang, and H. Wang, “Search for deep graph neural networks,” arXiv:2109.10047, 2021.
- L. Wei, H. Zhao, and Z. He, “Learn layer-wise connections in graph neural networks,” arXiv:2112.13585, 2021.
- C. Wang et al., “Fl-agcns: Federated learning framework for automatic graph convolutional network search,” arXiv:2104.04141, 2021.
- Y. Zhang, H. You, Y. Fu, T. Geng, A. Li, and Y. Lin, “G-cos: Gnn-accelerator co-search towards both better accuracy and efficiency,” in 2021 IEEE/ACM International Conference On Computer Aided Design (ICCAD). IEEE, 2021, pp. 1–9.
- L. Wei, H. Zhao, Q. Yao, and Z. He, “Pooling architecture search for graph classification,” in Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 2021, pp. 2091–2100.
- Q. Lu, W. Jiang, M. Jiang, J. Hu, S. Dasgupta, and Y. Shi, “Fgnas: Fpga-aware graph neural architecture search,” 2020.
- J. Chen et al., “Graphpas: Parallel architecture search for graph neural networks,” in SIGIR, 2021.
- R. Cai et al., “Algnn: Auto-designed lightweight graph neural network,” in PRICAI, 2021.
- C. C. Coello and M. S. Lechuga, “Mopso: A proposal for multiple objective particle swarm optimization,” in CEC, 2002.
- S. Cai et al., “Edge-featured graph neural architecture search,” arXiv:2109.01356, 2021.
- Z. Wang, S. Di, and L. Chen, “Autogel: An automated graph neural network with explicit link information,” NeurIPS, 2021.
- S. Xie et al., “Snas: stochastic neural architecture search,” ICLR, 2019.
- Y. Qin, X. Wang, Z. Zhang, and W. Zhu, “Graph differentiable architecture search with structure learning,” Advances in Neural Information Processing Systems, vol. 34, pp. 16860–16872, 2021.
- B. Xie, H. Chang, Z. Zhang, X. Wang, D. Wang, Z. Zhang, R. Ying, and W. Zhu, “Adversarially robust neural architecture search for graph neural networks,” in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2023, pp. 8143–8152.
- Y. Qin, X. Wang, Z. Zhang, P. Xie, and W. Zhu, “Graph neural architecture search under distribution shifts,” in International Conference on Machine Learning. PMLR, 2022, pp. 18083–18095.
- C. Guan, X. Wang, H. Chen, Z. Zhang, and W. Zhu, “Large-scale graph neural architecture search,” in International Conference on Machine Learning. PMLR, 2022, pp. 7968–7981.
- W. Zhang, Y. Shen, Z. Lin, Y. Li, X. Li, W. Ouyang, Y. Tao, Z. Yang, and B. Cui, “Pasca: A graph neural architecture search system under the scalable paradigm,” in Proceedings of the ACM Web Conference 2022, 2022, pp. 1817–1828.
- C. Li, H. Xu, and K. He, “Differentiable meta multigraph search with partial message propagation on heterogeneous information networks,” in Proceedings of the AAAI Conference on Artificial Intelligence, vol. 37, no. 7, 2023, pp. 8518–8526.
- Z. Zhang, Z. Zhang, X. Wang, Y. Qin, Z. Qin, and W. Zhu, “Dynamic heterogeneous graph attention neural architecture search,” AAAI, 2023.
- G. Li et al., “Deepgcns: Can gcns go as deep as cnns?” in ICCV, 2019.
- K. Xu et al., “Representation learning on graphs with jumping knowledge networks,” in ICML, 2018.
- R. Ying et al., “Hierarchical graph representation learning with differentiable pooling,” in NeurIPS, 2018.
- Q. Li, Z. Han, and X.-M. Wu, “Deeper insights into graph convolutional networks for semi-supervised learning,” in AAAI, 2018.
- E. Jang, S. Gu, and B. Poole, “Categorical reparameterization with gumbel-softmax,” in ICLR, 2017.
- C. J. Maddison, A. Mnih, and Y. W. Teh, “The concrete distribution: A continuous relaxation of discrete random variables,” in ICLR, 2017.
- H. Shi et al., “Bridging the gap between sample-based and one-shot neural architecture search with bonas,” NeurIPS, 2020.
- Z. Guo et al., “Single path one-shot neural architecture search with uniform sampling,” in ECCV, 2020.
- E. Real et al., “Large-scale evolution of image classifiers,” in ICML, 2017.
- H. Li, X. Wang, Z. Zhang, Z. Yuan, H. Li, and W. Zhu, “Disentangled contrastive learning on graphs,” Advances in Neural Information Processing Systems, vol. 34, pp. 21872–21884, 2021.
- H. Li, Z. Zhang, X. Wang, and W. Zhu, “Disentangled graph contrastive learning with independence promotion,” IEEE Transactions on Knowledge and Data Engineering, 2022.
- Y. Gao, P. Zhang, C. Zhou, H. Yang, Z. Li, Y. Hu, and P. S. Yu, “Hgnas++: efficient architecture search for heterogeneous graph neural networks,” IEEE Transactions on Knowledge and Data Engineering, 2023.
- Z. Zhang, X. Wang, C. Guan, Z. Zhang, H. Li, and W. Zhu, “AutoGT: Automated graph transformer architecture search,” in The Eleventh International Conference on Learning Representations, 2023. [Online]. Available: https://openreview.net/forum?id=GcM7qfl5zY
- C. Ying, T. Cai, S. Luo, S. Zheng, G. Ke, D. He, Y. Shen, and T.-Y. Liu, “Do transformers really perform badly for graph representation?” in Advances in Neural Information Processing Systems, A. Beygelzimer, Y. Dauphin, P. Liang, and J. W. Vaughan, Eds., 2021. [Online]. Available: https://openreview.net/forum?id=OeWooOxFwDa
- J. You et al., “Graph structure of neural networks,” in ICML, 2020.
- M. Fey and J. E. Lenssen, “Fast graph representation learning with PyTorch Geometric,” in ICLR Workshop, 2019.
- M. Wang, D. Zheng, Z. Ye, Q. Gan, M. Li, X. Song, J. Zhou, C. Ma, L. Yu, Y. Gai et al., “Deep graph library: A graph-centric, highly-performant package for graph neural networks,” arXiv preprint arXiv:1909.01315, 2019.
- P. W. Battaglia et al., “Relational inductive biases, deep learning, and graph networks,” arXiv:1806.01261, 2018.
- R. Zhu et al., “Aligraph: A comprehensive graph neural network platform,” VLDB, 2019.
- Alibaba, “Euler: A distributed graph deep learning framework,” https://github.com/alibaba/euler, 2019.
- A. Lerer et al., “PyTorch-BigGraph: A Large-scale Graph Embedding System,” in SysML, 2019.
- Natural Language Processing and Paddle teams at Baidu, “Paddle graph learning,” https://github.com/PaddlePaddle/PGL, 2021, [Online; accessed 28-Dec-2021].
- S. Li, J. Pfeifer et al., “Tensorflow gnn,” https://github.com/tensorflow/gnn, 2021, [Online; accessed 28-Dec-2021].
- CSIRO Data61, “Stellargraph machine learning library,” https://github.com/stellargraph/stellargraph, 2018.
- D. Grattarola and C. Alippi, “Graph neural networks in tensorflow and keras with spektral,” ICML workshop, 2020.
- Knowledge Engineering Group, Tsinghua University, “Cogdl: An extensive research toolkit for deep learning on graphs,” https://github.com/THUDM/cogdl, 2020.
- Natural Language Processing Lab at Tsinghua University, “Openne: An open source toolkit for network embedding,” https://github.com/thunlp/OpenNE, 2018.
- OpenHGNN Team at GAMMA Lab and DGL Team, “Heterogeneous graph neural network,” https://github.com/BUPT-GAMMA/OpenHGNN, 2021, [Online; accessed 28-Dec-2021].
- P. Goyal and E. Ferrara, “Gem: A python package for graph embedding methods,” JOSS, 2018.
- B. Rozemberczki, O. Kiss, and R. Sarkar, “Karate Club: An API Oriented Open-source Python Framework for Unsupervised Learning on Graphs,” in CIKM, 2020.
- A. Hagberg, P. Swart, and D. S. Chult, “Exploring network structure, dynamics, and function using networkx,” Los Alamos National Lab. (LANL), Tech. Rep., 2008.
- Q. Zhang et al., “Retiarii: A deep learning exploratory-training framework,” in OSDI, 2020.
- H. Jin, Q. Song, and X. Hu, “Auto-keras: An efficient neural architecture search system,” in KDD, 2019.
- M. Feurer et al., “Auto-sklearn: efficient and robust automated machine learning,” in Automated Machine Learning, 2019.
- J. Bergstra, D. Yamins, and D. Cox, “Making a science of model search: Hyperparameter optimization in hundreds of dimensions for vision architectures,” in ICML, 2013.
- R. S. Olson et al., “Evaluation of a tree-based pipeline optimization tool for automating data science,” in GECCO, 2016.
- N. Erickson et al., “Autogluon-tabular: Robust and accurate automl for structured data,” arXiv:2003.06505, 2020.
- A. de Romblay et al., “Mlbox, machine learning box,” https://github.com/AxeldeRomblay/MLBox, 2018.
- MLJAR, “Mljar automated machine learning,” https://github.com/mljar/mljar-supervised, 2019.
- P. Sen et al., “Collective classification in network data,” AI magazine, 2008.
- O. Shchur et al., “Pitfalls of graph neural network evaluation,” Relational Representation Learning Workshop, NeurIPS 2018, 2018.
- W. Hamilton, Z. Ying, and J. Leskovec, “Inductive representation learning on large graphs,” in NeurIPS, 2017.
- A. K. Debnath, R. L. Lopez de Compadre, G. Debnath, A. J. Shusterman, and C. Hansch, “Structure-activity relationship of mutagenic aromatic and heteroaromatic nitro compounds. correlation with molecular orbital energies and hydrophobicity,” Journal of medicinal chemistry, vol. 34, no. 2, pp. 786–797, 1991.
- K. M. Borgwardt et al., “Protein function prediction via graph kernels,” Bioinformatics, 2005.
- P. Yanardag and S. Vishwanathan, “Deep graph kernels,” in KDD, 2015.
- R. Milo, S. Shen-Orr, S. Itzkovitz, N. Kashtan, D. Chklovskii, and U. Alon, “Network motifs: simple building blocks of complex networks,” Science, vol. 298, no. 5594, pp. 824–827, 2002.
- Z. Zhang, P. Cui, J. Pei, X. Wang, and W. Zhu, “Eigen-gnn: A graph structure preserving plug-in for gnns,” IEEE Transactions on Knowledge and Data Engineering, 2021.
- S. Brin and L. Page, “The anatomy of a large-scale hypertextual web search engine,” Computer networks and ISDN systems, 1998.
- G. Ke, Q. Meng, T. Finley, T. Wang, W. Chen, W. Ma, Q. Ye, and T.-Y. Liu, “Lightgbm: A highly efficient gradient boosting decision tree,” Advances in neural information processing systems, vol. 30, 2017.
- R. A. Rossi, R. Zhou, and N. K. Ahmed, “Deep inductive graph representation learning,” TKDE, 2018.
- A. Tsitsulin, D. Mottin, P. Karras, A. Bronstein, and E. Müller, “Netlsd: hearing the shape of a graph,” in Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2018, pp. 2347–2356.
- P. Veličković et al., “Graph Attention Networks,” in ICLR, 2018.
- K. Xu et al., “How powerful are graph neural networks?” in International Conference on Learning Representations (ICLR), 2019.
- H. Gao and S. Ji, “Graph u-nets,” in Proceedings of the 36th International Conference on Machine Learning, 2019.
- F. Errica et al., “A fair comparison of graph neural networks for graph classification,” in ICLR, 2020.
- Y. Qin, Z. Zhang, X. Wang, Z. Zhang, and W. Zhu, “Nas-bench-graph: Benchmarking graph neural architecture search,” Advances in Neural Information Processing Systems, vol. 35, pp. 54–69, 2022.
- M. Defferrard, X. Bresson, and P. Vandergheynst, “Convolutional neural networks on graphs with fast localized spectral filtering,” Advances in neural information processing systems, vol. 29, pp. 3844–3852, 2016.
- F. M. Bianchi, D. Grattarola, L. Livi, and C. Alippi, “Graph neural networks with convolutional arma filters,” IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021.
- C. Morris, M. Ritzert, M. Fey, W. L. Hamilton, J. E. Lenssen, G. Rattan, and M. Grohe, “Weisfeiler and leman go neural: Higher-order graph neural networks,” in Proceedings of the AAAI Conference on Artificial Intelligence, vol. 33, 2019, pp. 4602–4609.
- Z. Yang, W. Cohen, and R. Salakhudinov, “Revisiting semi-supervised learning with graph embeddings,” in International conference on machine learning. PMLR, 2016, pp. 40–48.
- Microsoft, “Neural Network Intelligence,” Jan. 2021. [Online]. Available: https://github.com/microsoft/nni
- L. Li and A. Talwalkar, “Random search and reproducibility for neural architecture search,” in Uncertainty in artificial intelligence. PMLR, 2020, pp. 367–377.
- H. Yuan et al., “Explainability in graph neural networks: A taxonomic survey,” arXiv:2012.15445, 2020.
- H. Li, X. Wang, Z. Zhang, and W. Zhu, “Ood-gnn: Out-of-distribution generalized graph neural network,” arXiv:2112.03806, 2021.
- ——, “Out-of-distribution generalization on graphs: A survey,” arXiv preprint arXiv:2202.07987, 2022.
- H. Li, Z. Zhang, X. Wang, and W. Zhu, “Learning invariant graph representations for out-of-distribution generalization,” Advances in Neural Information Processing Systems, vol. 35, pp. 11828–11841, 2022.
- ——, “Invariant node representation learning under distribution shifts with multiple latent environments,” ACM Transactions on Information Systems, vol. 42, no. 1, pp. 1–30, 2023.
- L. Sun et al., “Adversarial attack and defense on graph data: A survey,” arXiv:1812.10528, 2018.
- S. Xie et al., “Exploring randomly wired neural networks for image recognition,” in ICCV, 2019.
- C. Zhang, M. Ren, and R. Urtasun, “Graph hypernetworks for neural architecture search,” in ICLR, 2018.
- L. Dudziak et al., “Brp-nas: Prediction-based nas using gcns,” NeurIPS, 2020.
- Y. Qin, X. Wang, P. Cui, and W. Zhu, “Gqnas: Graph q network for neural architecture search,” in ICDM, 2021.
- A. Auten, M. Tomei, and R. Kumar, “Hardware acceleration of graph neural networks,” in DAC, 2020.
- H. Cai, L. Zhu, and S. Han, “Proxylessnas: Direct neural architecture search on target task and hardware,” in International Conference on Learning Representations, 2018.
- M. Tan et al., “Mnasnet: Platform-aware neural architecture search for mobile,” in CVPR, 2019.
- Y. Jiang, X. Wang, and W. Zhu, “Hardware-aware transformable architecture search with efficient search space,” in ICME, 2020.
- V. P. Dwivedi et al., “Benchmarking graph neural networks,” arXiv:2003.00982, 2020.
- C. Ying et al., “Nas-bench-101: Towards reproducible neural architecture search,” in ICML, 2019.