Mass Spectra Prediction with Structural Motif-based Graph Neural Networks (2306.16085v1)
Abstract: Mass spectra, which are agglomerations of ionized fragments from targeted molecules, play a crucial role across various fields in the identification of molecular structures. A prevalent analysis method involves spectral library searches, where unknown spectra are cross-referenced against a database. The effectiveness of such search-based approaches, however, is limited by the coverage of existing mass spectral databases, underscoring the need to expand them via mass spectra prediction. In this research, we propose the Motif-based Mass Spectrum Prediction Network (MoMS-Net), a system that predicts mass spectra from structural-motif information using Graph Neural Networks (GNNs). We have tested our model across diverse mass spectra and observed that it outperforms existing models. MoMS-Net considers substructures at the graph level, which allows it to incorporate long-range dependencies while using less memory than graph transformer models.
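To make the abstract's high-level description concrete, below is a minimal, hypothetical sketch of a motif-based spectrum predictor. It is not the paper's MoMS-Net: RDKit's BRICS decomposition stands in for the motif vocabulary, a single mean-aggregation message-passing step stands in for the GNN, and `MotifSpectrumNet`, `MAX_MZ`, and `MOTIF_VOCAB` are illustrative names introduced here, not taken from the paper.

```python
# Sketch: decompose a molecule into surrogate motifs, refine atom embeddings
# with one message-passing step, pool atom- and motif-level summaries, and
# regress a fixed-length mass spectrum binned by m/z. All sizes are assumptions.
import torch
import torch.nn as nn
from rdkit import Chem
from rdkit.Chem import BRICS

MAX_MZ = 1000       # spectrum binned at unit m/z resolution (assumption)
MOTIF_VOCAB = 5000  # hypothetical motif-vocabulary size

def motif_fragments(smiles):
    """BRICS fragments as surrogate structural motifs (a stand-in for the
    paper's actual motif extraction)."""
    mol = Chem.MolFromSmiles(smiles)
    return sorted(BRICS.BRICSDecompose(mol))

def mean_message_passing(h, edges):
    """One round of mean-aggregation message passing: a minimal GNN layer."""
    agg = torch.zeros_like(h)
    deg = torch.zeros(h.size(0), 1)
    for i, j in edges:
        agg[i] += h[j]
        agg[j] += h[i]
        deg[i] += 1
        deg[j] += 1
    return h + agg / deg.clamp(min=1.0)

class MotifSpectrumNet(nn.Module):
    """Toy predictor: pooled atom and motif embeddings feed an MLP that
    outputs non-negative per-bin peak intensities."""
    def __init__(self, n_atom_types=119, dim=128):
        super().__init__()
        self.atom_emb = nn.Embedding(n_atom_types, dim)
        self.motif_emb = nn.Embedding(MOTIF_VOCAB, dim)
        self.head = nn.Sequential(
            nn.Linear(2 * dim, 256), nn.ReLU(), nn.Linear(256, MAX_MZ))

    def forward(self, atom_ids, edges, motif_ids):
        h = mean_message_passing(self.atom_emb(atom_ids), edges)
        a = h.mean(dim=0)                       # molecule-level atom summary
        m = self.motif_emb(motif_ids).mean(0)   # motif-level summary
        return torch.relu(self.head(torch.cat([a, m])))

if __name__ == "__main__":
    smi = "CC(=O)Oc1ccccc1C(=O)O"  # aspirin
    mol = Chem.MolFromSmiles(smi)
    atom_ids = torch.tensor([a.GetAtomicNum() for a in mol.GetAtoms()])
    edges = [(b.GetBeginAtomIdx(), b.GetEndAtomIdx()) for b in mol.GetBonds()]
    motif_ids = torch.tensor(
        [hash(s) % MOTIF_VOCAB for s in motif_fragments(smi)])
    spectrum = MotifSpectrumNet()(atom_ids, edges, motif_ids)
    print(spectrum.shape)  # torch.Size([1000])
```

The motif embeddings enter the prediction at the molecule level rather than through atom-wise neighborhoods, which is the cheap route to long-range substructure signal that the abstract contrasts with the quadratic attention cost of graph transformers.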
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. 
[2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. 
[2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 
32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. 
[2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 
32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. 
[2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. 
[2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. 
Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. 
Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Aebersold, R., Mann, M.: Mass-spectrometric exploration of proteome structure and function. Nature 537(7620), 347–355 (2016) Gowda and Djukovic [2014] Gowda, G.N., Djukovic, D.: Overview of mass spectrometry-based metabolomics: opportunities and challenges. Mass Spectrometry in Metabolomics: Methods and Protocols, 3–12 (2014) De Vijlder et al. [2018] De Vijlder, T., Valkenborg, D., Lemière, F., Romijn, E.P., Laukens, K., Cuyckens, F.: A tutorial in small molecule identification via electrospray ionization-mass spectrometry: The practical art of structural elucidation. Mass spectrometry reviews 37(5), 607–629 (2018) Stein [1995] Stein, S.E.: Chemical substructure identification by mass spectral library searching. Journal of the American Society for Mass Spectrometry 6(8), 644–655 (1995) Stein and Scott [1994] Stein, S.E., Scott, D.R.: Optimization and testing of mass spectral library search algorithms for compound identification. Journal of the American Society for Mass Spectrometry 5(9), 859–866 (1994) Stein [2017] Stein, S.: Mass spectral database. National Institute of Standards and Technology (NIST) (2017) Wiley et al. [2006] Wiley, J., et al.: Wiley Registry of Mass Spectral Data, (2006) MoNA [2021] MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. 
Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 
3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Gowda, G.N., Djukovic, D.: Overview of mass spectrometry-based metabolomics: opportunities and challenges. Mass Spectrometry in Metabolomics: Methods and Protocols, 3–12 (2014) De Vijlder et al. [2018] De Vijlder, T., Valkenborg, D., Lemière, F., Romijn, E.P., Laukens, K., Cuyckens, F.: A tutorial in small molecule identification via electrospray ionization-mass spectrometry: The practical art of structural elucidation. Mass spectrometry reviews 37(5), 607–629 (2018) Stein [1995] Stein, S.E.: Chemical substructure identification by mass spectral library searching. Journal of the American Society for Mass Spectrometry 6(8), 644–655 (1995) Stein and Scott [1994] Stein, S.E., Scott, D.R.: Optimization and testing of mass spectral library search algorithms for compound identification. Journal of the American Society for Mass Spectrometry 5(9), 859–866 (1994) Stein [2017] Stein, S.: Mass spectral database. National Institute of Standards and Technology (NIST) (2017) Wiley et al. [2006] Wiley, J., et al.: Wiley Registry of Mass Spectral Data, (2006) MoNA [2021] MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. 
Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. 
[2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) De Vijlder, T., Valkenborg, D., Lemière, F., Romijn, E.P., Laukens, K., Cuyckens, F.: A tutorial in small molecule identification via electrospray ionization-mass spectrometry: The practical art of structural elucidation. Mass spectrometry reviews 37(5), 607–629 (2018) Stein [1995] Stein, S.E.: Chemical substructure identification by mass spectral library searching. Journal of the American Society for Mass Spectrometry 6(8), 644–655 (1995) Stein and Scott [1994] Stein, S.E., Scott, D.R.: Optimization and testing of mass spectral library search algorithms for compound identification. 
Journal of the American Society for Mass Spectrometry 5(9), 859–866 (1994) Stein [2017] Stein, S.: Mass spectral database. National Institute of Standards and Technology (NIST) (2017) Wiley et al. [2006] Wiley, J., et al.: Wiley Registry of Mass Spectral Data, (2006) MoNA [2021] MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. 
In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. 
[2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. 
International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. 
Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. 
[2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. 
[2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. 
[2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. 
International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. 
International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). 
PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 
3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. 
Journal of the American Society for Mass Spectrometry 5(9), 859–866 (1994) Stein [2017] Stein, S.: Mass spectral database. National Institute of Standards and Technology (NIST) (2017) Wiley et al. [2006] Wiley, J., et al.: Wiley Registry of Mass Spectral Data, (2006) MoNA [2021] MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. 
In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. 
Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Stein, S.: Mass spectral database. National Institute of Standards and Technology (NIST) (2017) Wiley et al. [2006] Wiley, J., et al.: Wiley Registry of Mass Spectral Data, (2006) MoNA [2021] MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. 
[2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. 
[2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wiley, J., et al.: Wiley Registry of Mass Spectral Data, (2006) MoNA [2021] MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. 
[2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. 
International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. 
ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. 
[2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. 
Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. 
[2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. 
Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. 
[2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 
32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. 
[2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. 
International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. 
Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. 
In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. 
[2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. 
[2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. 
[2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. 
[2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. 
[2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. 
Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 
3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. 
[2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. 
arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. 
[2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. 
Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. 
Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. 
[2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Stein, S.: Mass spectral database. National Institute of Standards and Technology (NIST) (2017) Wiley et al. [2006] Wiley, J., et al.: Wiley Registry of Mass Spectral Data, (2006) MoNA [2021] MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. 
[1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. 
In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Wiley, J., et al.: Wiley Registry of Mass Spectral Data, (2006) MoNA [2021] MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. 
Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. 
Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. 
[2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. 
[2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. 
International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. 
[2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. 
Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. 
Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 
1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. 
International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. 
In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. 
[2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 
1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? 
Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. 
Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Stein, S.E.: Chemical substructure identification by mass spectral library searching. Journal of the American Society for Mass Spectrometry 6(8), 644–655 (1995) Stein and Scott [1994] Stein, S.E., Scott, D.R.: Optimization and testing of mass spectral library search algorithms for compound identification. Journal of the American Society for Mass Spectrometry 5(9), 859–866 (1994) Stein [2017] Stein, S.: Mass spectral database. National Institute of Standards and Technology (NIST) (2017) Wiley et al. [2006] Wiley, J., et al.: Wiley Registry of Mass Spectral Data, (2006) MoNA [2021] MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. 
[2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. 
[2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. 
Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. 
Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. 
Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. 
[2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. 
Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. 
Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. 
In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. 
[2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. 
[2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. 
[2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. 
Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. 
[2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. 
Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. 
[2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 
32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. 
[2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. 
International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. 
Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. 
In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. 
[2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. 
[2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Stein, S.: Mass spectral database. National Institute of Standards and Technology (NIST) (2017) Wiley et al. [2006] Wiley, J., et al.: Wiley Registry of Mass Spectral Data, (2006) MoNA [2021] MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). 
PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wiley, J., et al.: Wiley Registry of Mass Spectral Data, (2006) MoNA [2021] MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. 
[2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. 
[2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. 
[2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. 
Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 
3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. 
Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 
3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. 
[2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. 
Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). 
PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. 
International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 
3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 
1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. 
International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? 
Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical Chemistry 92(6), 4275–4283 (2020)
- Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022)
- Young, A., Wang, B., Röst, H.: MassFormer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021)
- Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. In: International Conference on Machine Learning (2023)
- Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002)
- Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR
- Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021)
- Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022)
- Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR
- Chen, D., O'Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR
- Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug–gene interaction prediction. In: International Joint Conference on Artificial Intelligence (IJCAI) (2022)
- Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. In: International Conference on Learning Representations (2023)
- Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR
- Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020)
- Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021)
- Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018)
- Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020)
- Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. In: International Conference on Learning Representations (ICLR) (2021)
- Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021)
- Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR
- Landrum, G., et al.: RDKit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013)
- Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018)
- Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. Molecular frameworks. Journal of Medicinal Chemistry 39(15), 2887–2893 (1996)
- Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. In: Association for Computational Linguistics (ACL) (2021)
- Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep Graph Library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. 
[2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. 
[2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. 
[2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. 
[2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. 
[2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. 
[2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. 
Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. 
[2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. 
[2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? 
Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. 
[2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. 
[2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. 
[2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- MoNA: MassBank of North America. Massbank of North America (2021) Ji et al. [2020] Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. 
[2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. 
Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ji, H., Deng, H., Lu, H., Zhang, Z.: Predicting a molecular fingerprint from an electron ionization mass spectrum with deep neural networks. Analytical Chemistry 92(13), 8649–8653 (2020) Eng et al. [1994] Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). 
PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Eng, J.K., McCormack, A.L., Yates, J.R.: An approach to correlate tandem mass spectral data of peptides with amino acid sequences in a protein database. Journal of the american society for mass spectrometry 5(11), 976–989 (1994) Tran et al. [2017] Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. 
Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Tran, N.H., Zhang, X., Xin, L., Shan, B., Li, M.: De novo peptide sequencing by deep learning. Proceedings of the National Academy of Sciences 114(31), 8247–8252 (2017) Dührkop et al. [2015] Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using csi: Fingerid. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. 
In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. 
Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 
5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. 
[2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. 
In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. 
Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. 
International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. 
International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. 
[2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. 
International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. 
International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). 
PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 
3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. 
[2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. 
- Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical Chemistry 92(6), 4275–4283 (2020)
- Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022)
- Young, A., Wang, B., Röst, H.: MassFormer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021)
- Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023)
[2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. 
[2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. 
[2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. 
[2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. 
In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. 
Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. 
International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. 
International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. 
[2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. 
International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. 
International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). 
Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015) Bauer and Grimme [2016] Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. 
[2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. 
[2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. 
[2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. 
[2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. 
International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. 
Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. 
[2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. 
[2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. 
[2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. 
International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. 
International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). 
[2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016) Grimme [2013] Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. 
[2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. 
[2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013) Guerra et al. [2012] Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. 
[2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. 
International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. 
Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. 
[2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. 
[2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. 
[2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. 
International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. 
International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). 
PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
- Dührkop, K., Shen, H., Meusel, M., Rousu, J., Böcker, S.: Searching molecular structure databases with tandem mass spectra using CSI:FingerID. Proceedings of the National Academy of Sciences 112(41), 12580–12585 (2015)
- Bauer, C.A., Grimme, S.: How to compute electron ionization mass spectra from first principles. The Journal of Physical Chemistry A 120(21), 3755–3766 (2016)
- Grimme, S.: Towards first principles calculation of electron impact mass spectra of molecules. Angewandte Chemie International Edition 52(24), 6306–6312 (2013)
- Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter Bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012)
- Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017)
- Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in GC/MS compound identification. Analytical Chemistry 88(15), 7689–7697 (2016)
- Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron-ionization mass spectrometry using neural networks. ACS Central Science 5(4), 700–708 (2019)
- Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical Chemistry 92(6), 4275–4283 (2020)
- Zhang et al.
[2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. 
International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. 
Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. 
[2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. 
[2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. 
[2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. 
International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. 
International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). 
PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 
3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. 
[2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. 
- Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical Chemistry 92(6), 4275–4283 (2020)
- Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022)
- Young, A., Wang, B., Röst, H.: MassFormer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021)
- Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023)
ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. 
[2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). 
PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. 
International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. 
[2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. 
[2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. 
[2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. 
[2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. 
Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. 
[2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. 
[2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. 
Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). 
PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. 
International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 
3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 
1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. 
International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? 
Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Guerra, M., Parente, F., Indelicato, P., Santos, J.: Modified binary encounter bethe model for electron-impact ionization. International Journal of Mass Spectrometry 313, 1–7 (2012) Ásgeirsson et al. [2017] Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. 
[2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ásgeirsson, V., Bauer, C.A., Grimme, S.: Quantum chemical calculation of electron ionization mass spectra for general organic and inorganic molecules. Chemical Science 8(7), 4879–4895 (2017) Allen et al. [2016] Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. 
[2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in gc/ms compound identification. Analytical chemistry 88(15), 7689–7697 (2016) Wei et al. [2019] Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. 
[2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. 
ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). 
PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. 
International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. 
In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. 
[2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
arXiv preprint arXiv:1909.01315 (2019) Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. 
[2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. 
[2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. 
[2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. 
[2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. 
In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. 
[2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? 
Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Allen, F., Pon, A., Greiner, R., Wishart, D.: Computational prediction of electron ionization mass spectra to assist in GC/MS compound identification. Analytical Chemistry 88(15), 7689–7697 (2016)
- Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS Central Science 5(4), 700–708 (2019)
- Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical Chemistry 92(6), 4275–4283 (2020)
- Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022)
- Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021)
- Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023)
- Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002)
- Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR
- Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021)
- Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022)
- Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR
- Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR
- Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022)
- Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023)
- Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR
- Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020)
- Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021)
- Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018)
- Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020)
- Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021)
- Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021)
- Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR
- Landrum, G., et al.: RDKit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013)
- Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018)
- Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. Molecular frameworks. Journal of Medicinal Chemistry 39(15), 2887–2893 (1996)
- Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Linguistics (ACL) (2021)
1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? 
Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. 
Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Wei, J.N., Belanger, D., Adams, R.P., Sculley, D.: Rapid prediction of electron–ionization mass spectrometry using neural networks. ACS central science 5(4), 700–708 (2019) Liu et al. [2020] Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical chemistry 92(6), 4275–4283 (2020) Zhang et al. [2022] Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). 
PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. 
International Journal of Mass Spectrometry 475, 116817 (2022) Young et al. [2021] Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. 
[2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021) Murphy et al. [2023] Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. 
In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023) Milo et al. [2002] Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. 
IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. 
[2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
- Liu, K., Li, S., Wang, L., Ye, Y., Tang, H.: Full-spectrum prediction of peptides tandem mass spectra using deep neural network. Analytical Chemistry 92(6), 4275–4283 (2020)
- Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022)
- Young, A., Wang, B., Röst, H.: MassFormer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021)
- Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. In: International Conference on Machine Learning (2023)
- Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002)
- Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR
- Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021)
- Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022)
- Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR
- Chen, D., O'Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR
- Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. In: International Joint Conference on Artificial Intelligence (IJCAI) (2022)
- Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. In: International Conference on Learning Representations (2023)
- Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR
- Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020)
- Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021)
- Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018)
- Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020)
- Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. In: International Conference on Learning Representations (ICLR) (2021)
- Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021)
- Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR
- Landrum, G., et al.: RDKit: A software suite for cheminformatics, computational chemistry, and predictive modeling (2013)
- Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018)
- Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. Molecular frameworks. Journal of Medicinal Chemistry 39(15), 2887–2893 (1996)
- Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. In: Association for Computational Linguistics (ACL) (2021)
- Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. 
[2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 
1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. 
Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. 
In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 
3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. 
Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. 
molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). 
PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Zhang, B., Zhang, J., Xia, Y., Chen, P., Wang, B.: Prediction of electron ionization mass spectra based on graph convolutional networks. International Journal of Mass Spectrometry 475, 116817 (2022)
- Young, A., Wang, B., Röst, H.: MassFormer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021)
- Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. In: International Conference on Machine Learning (2023)
- Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002)
- Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR
- Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021)
- Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022)
- Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR
- Chen, D., O'Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR
- Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. In: International Joint Conference on Artificial Intelligence (IJCAI) (2022)
- Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023)
- Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR
- Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020)
- Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021)
- Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020)
- Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021)
- Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021)
- Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR
- Landrum, G., et al.: RDKit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013)
- Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018)
- Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. Molecular frameworks. Journal of Medicinal Chemistry 39(15), 2887–2893 (1996)
- Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Linguistics (ACL) (2021)
- Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep Graph Library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
[2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 
5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. 
In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. 
[2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. 
[2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. 
[2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. 
In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. 
[2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. 
Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. 
Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Young, A., Wang, B., Röst, H.: Massformer: Tandem mass spectrum prediction with graph transformers. arXiv preprint arXiv:2111.04824 (2021)
- Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023)
[2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. 
[2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? 
Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. 
[2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. 
[2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. 
[2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Murphy, M., Jegelka, S., Fraenkel, E., Kind, T., Healey, D., Butler, T.: Efficiently predicting high resolution mass spectra with graph neural networks. International Conference on Machine Learning (2023)
- Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002)
- Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR
- Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021)
- Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022)
- Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR
- Chen, D., O'Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR
- Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022)
- Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023)
- Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR
- Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020)
- Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021)
- Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018)
- Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020)
- Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021)
- Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021)
- Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR
- Landrum, G., et al.: RDKit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013)
- Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018)
- Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. Molecular frameworks. Journal of Medicinal Chemistry 39(15), 2887–2893 (1996)
- Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Linguistics (ACL) (2021)
- Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. 
[2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? 
Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Milo, R., Shen-Orr, S., Itzkovitz, S., Kashtan, N., Chklovskii, D., Alon, U.: Network motifs: simple building blocks of complex networks. Science 298(5594), 824–827 (2002) Yu and Gao [2022] Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. 
[2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Yu, Z., Gao, H.: Molecular representation learning via heterogeneous motif graph neural networks. In: International Conference on Machine Learning, pp. 25581–25594 (2022). PMLR Zhang and Li [2021] Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 
3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. 
[2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. 
Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. 
[2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Zhang, M., Li, P.: Nested graph neural networks. Advances in Neural Information Processing Systems 34, 15734–15747 (2021) Bouritsas et al. [2022] Bouritsas, G., Frasca, F., Zafeiriou, S., Bronstein, M.M.: Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Transactions on Pattern Analysis and Machine Intelligence 45(1), 657–668 (2022) Jin et al. [2020] Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR Chen et al. [2022] Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR Rao et al. [2022] Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022) Geng et al. [2023] Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (2023) Xu et al. [2018] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR Zhu et al. [2020] Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020)
[2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020) Wu et al. [2021] Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? 
Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021) Li et al. [2018] Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018) Chen et al. [2020] Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020) Alon and Yahav [2021] Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. 
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Jin, W., Barzilay, R., Jaakkola, T.: Hierarchical generation of molecular graphs using structural motifs. In: International Conference on Machine Learning, pp. 4839–4848 (2020). PMLR
- Chen, D., O’Bray, L., Borgwardt, K.: Structure-aware transformer for graph representation learning. In: International Conference on Machine Learning, pp. 3469–3489 (2022). PMLR
- Rao, J., Zheng, S., Mai, S., Yang, Y.: Communicative subgraph representation learning for multi-relational inductive drug-gene interaction prediction. International Joint Conferences on Artificial Intelligence (IJCAI) (2022)
- Geng, Z., Xie, S., Xia, Y., Wu, L., Qin, T., Wang, J., Zhang, Y., Wu, F., Liu, T.-Y.: De novo molecular generation via connection-aware motif mining. International Conference on Learning Representations (ICLR) (2023)
- Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR
- Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020)
- Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021)
- Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018)
- Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020)
- Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021)
- Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021)
- Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR
[2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021) Ying et al. [2021] Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021) Chen et al. [2020] Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR Landrum et al. [2013] Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? 
arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013) Xu et al. [2018] Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018) Bemis and Murcko [1996] Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996) Xu et al. [2021] Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019) Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Liguistics (ACL) (2021) Wang et al. [2019] Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. 
arXiv preprint arXiv:1909.01315 (2019) Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)
- Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K.-i., Jegelka, S.: Representation learning on graphs with jumping knowledge networks. In: International Conference on Machine Learning, pp. 5453–5462 (2018). PMLR
- Zhu, J., Yan, Y., Zhao, L., Heimann, M., Akoglu, L., Koutra, D.: Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in Neural Information Processing Systems 33, 7793–7804 (2020)
- Wu, Z., Jain, P., Wright, M., Mirhoseini, A., Gonzalez, J.E., Stoica, I.: Representing long-range context for graph neural networks with global attention. Advances in Neural Information Processing Systems 34, 13266–13279 (2021)
- Li, Q., Han, Z., Wu, X.-M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018)
- Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3438–3445 (2020)
- Alon, U., Yahav, E.: On the bottleneck of graph neural networks and its practical implications. International Conference on Learning Representations (ICLR) (2021)
- Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., Liu, T.-Y.: Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34, 28877–28888 (2021)
- Chen, M., Wei, Z., Huang, Z., Ding, B., Li, Y.: Simple and deep graph convolutional networks. In: International Conference on Machine Learning, pp. 1725–1735 (2020). PMLR
- Landrum, G., et al.: Rdkit: A software suite for cheminformatics, computational chemistry, and predictive modeling. Greg Landrum 8 (2013)
- Xu, K., Hu, W., Leskovec, J., Jegelka, S.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018)
- Bemis, G.W., Murcko, M.A.: The properties of known drugs. 1. molecular frameworks. Journal of medicinal chemistry 39(15), 2887–2893 (1996)
- Xu, P., Kumar, D., Yang, W., Zi, W., Tang, K., Huang, C., Cheung, J.C.K., Prince, S.J., Cao, Y.: Optimizing deeper transformers on small datasets. Association for Computational Linguistics (ACL) (2021)
- Wang, M., Zheng, D., Ye, Z., Gan, Q., Li, M., Song, X., Zhou, J., Ma, C., Yu, L., Gai, Y., et al.: Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315 (2019)