
A graphon-signal analysis of graph neural networks (2305.15987v2)

Published 25 May 2023 in cs.LG

Abstract: We present an approach for analyzing message passing graph neural networks (MPNNs) based on an extension of graphon analysis to a so-called graphon-signal analysis. An MPNN is a function that takes a graph and a signal on the graph (a graph-signal) and returns some value. Since the input space of MPNNs is non-Euclidean, i.e., graphs can be of any size and topology, properties such as generalization are less well understood for MPNNs than for Euclidean neural networks. We claim that one important missing ingredient in past work is a meaningful graph-signal similarity measure that endows the space of inputs to MPNNs with a regular structure. We present such a similarity measure, called the graphon-signal cut distance, which makes the space of all graph-signals a dense subset of a compact metric space -- the graphon-signal space. Informally, two deterministic graph-signals are close in cut distance if they "look like" they were sampled from the same random graph-signal model. Hence, our cut distance is a natural notion of graph-signal similarity, which allows comparing any pair of graph-signals of any size and topology. We prove that MPNNs are Lipschitz continuous functions over the graphon-signal metric space. We then give two applications of this result: 1) a generalization bound for MPNNs, and 2) the stability of MPNNs to subsampling of graph-signals. Our results apply to any regular enough MPNN on any distribution of graph-signals, making the analysis rather universal.
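
For orientation, the sketch below recalls the classical cut norm and cut distance from graph limit theory (Lovász), and then indicates, as an assumption rather than the paper's verbatim definition, how a signal term could be folded in to obtain a graphon-signal cut distance of the kind the abstract describes.

\[
\|W\|_{\square} \;=\; \sup_{S,T \subseteq [0,1]} \left| \int_{S \times T} W(x,y)\,dx\,dy \right|,
\qquad
\delta_{\square}(W,V) \;=\; \inf_{\varphi} \, \|W - V^{\varphi}\|_{\square},
\]

where $W, V : [0,1]^2 \to [0,1]$ are graphons, the supremum ranges over measurable sets, the infimum ranges over measure-preserving bijections $\varphi$ of $[0,1]$, and $V^{\varphi}(x,y) = V(\varphi(x), \varphi(y))$. A plausible graphon-signal analogue (hypothetical formulation, for illustration only) adds a cut-type norm on signals $f, g : [0,1] \to \mathbb{R}$, e.g. $\|f\|_{\square} = \sup_{S} \left| \int_{S} f(x)\,dx \right|$, and aligns graph and signal with the same map:

\[
\delta_{\square}\bigl((W,f),(V,g)\bigr) \;\approx\; \inf_{\varphi} \Bigl( \|W - V^{\varphi}\|_{\square} \;+\; \|f - g \circ \varphi\|_{\square} \Bigr).
\]

A finite graph with node features embeds into this space via its induced step graphon and step signal, which is what allows graph-signals of different sizes to be compared under one metric.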
