AutoGL: A Library for Automated Graph Learning (2104.04987v4)
Abstract: Recent years have witnessed an upsurge in research interest and applications of machine learning on graphs. However, manually designing the optimal machine learning algorithm for each graph dataset and task is inflexible, labor-intensive, and requires expert knowledge, limiting adaptivity and applicability. Automated machine learning (AutoML) on graphs, which aims to automatically design the optimal machine learning algorithm for a given graph dataset and task, has therefore received considerable attention. However, no existing library fully supports AutoML on graphs. To fill this gap, we present Automated Graph Learning (AutoGL), the first dedicated library for automated machine learning on graphs. AutoGL is open source, easy to use, and flexible to extend. Specifically, we propose a three-layer architecture consisting of backends that interface with devices, a complete automated graph learning pipeline, and supported graph applications. The automated machine learning pipeline further contains five functional modules: auto feature engineering, neural architecture search, hyper-parameter optimization, model training, and auto ensemble, covering the majority of existing AutoML methods on graphs. For each module, we provide numerous state-of-the-art methods together with flexible base classes and APIs that allow easy usage and customization. We further provide experimental results to showcase the usage of the AutoGL library. We also present AutoGL-light, a lightweight version of AutoGL that facilitates customizing pipelines and enriching applications, as well as benchmarks for graph neural architecture search. The code of AutoGL is publicly available at https://github.com/THUMNLab/AutoGL.
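To illustrate how the five pipeline modules fit together, below is a minimal node-classification sketch in the style of AutoGL's public quick-start. The identifiers used here (`build_dataset_from_name`, `AutoNodeClassifier`, and the string names for each module) are assumptions taken from the project's README and may differ across AutoGL versions.

```python
# Minimal AutoGL sketch (adapted from the public quick-start; module and
# parameter names are assumptions that may vary across versions).
import torch
from autogl.datasets import build_dataset_from_name
from autogl.solver import AutoNodeClassifier

# Load a benchmark graph dataset by name.
cora = build_dataset_from_name('cora')

# The solver wires together the pipeline modules described in the abstract:
# auto feature engineering, candidate model training, hyper-parameter
# optimization, and auto ensemble.
solver = AutoNodeClassifier(
    feature_module='deepgl',      # auto feature engineering
    graph_models=['gcn', 'gat'],  # candidate models to train
    hpo_module='anneal',          # hyper-parameter optimization
    ensemble_module='voting',     # auto ensemble
    device=torch.device('cuda' if torch.cuda.is_available() else 'cpu'),
)

# Fit within a wall-clock budget (seconds), then predict class probabilities.
solver.fit(cora, time_limit=3600)
prediction = solver.predict_proba()
```

The design choice worth noting is that each keyword argument selects an implementation of one pipeline module, so swapping, say, the HPO strategy is a one-line change rather than a rewrite of the training loop.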