
Efficient and Scalable Graph Generation through Iterative Local Expansion (2312.11529v4)

Published 14 Dec 2023 in cs.SI and cs.LG

Abstract: In the realm of generative models for graphs, extensive research has been conducted. However, most existing methods struggle with large graphs due to the complexity of representing the entire joint distribution across all node pairs and capturing both global and local graph structures simultaneously. To overcome these issues, we introduce a method that generates a graph by progressively expanding a single node into a target graph. In each step, nodes and edges are added in a localized manner through denoising diffusion, building the global structure first and then refining the local details. The local generation avoids modeling the entire joint distribution over all node pairs, achieving substantial computational savings with subquadratic runtime relative to node count while maintaining high expressivity through multiscale generation. Our experiments show that our model achieves state-of-the-art performance on well-established benchmark datasets while successfully scaling to graphs with at least 5000 nodes. Our method is also the first to successfully extrapolate to graphs outside the training distribution, demonstrating substantially better generalization than existing methods.
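The expansion idea in the abstract can be illustrated with a toy sketch: start from a single node and repeatedly replace each node with a small cluster, sampling edges only locally (within a cluster and between clusters whose parents were connected). In the paper these local choices are sampled with denoising diffusion; the sketch below substitutes plain random sampling, and all function names and parameters are illustrative, not the authors' API.

```python
import random

def expand_step(adj, max_children=3, p_intra=0.8, rng=None):
    """One toy expansion step of iterative local expansion.

    Each node is replaced by a small cluster of child nodes; sibling
    edges are sampled locally, and every parent edge is inherited by
    an edge between the corresponding child clusters. The real method
    samples these local decisions with denoising diffusion.
    `adj` is an undirected graph as {node: set(neighbors)} with int ids.
    """
    rng = rng or random.Random(0)
    children = {}  # parent node -> list of child node ids
    new_adj = {}
    next_id = 0
    for v in adj:
        k = rng.randint(1, max_children)
        children[v] = list(range(next_id, next_id + k))
        next_id += k
        for c in children[v]:
            new_adj[c] = set()
        # sample edges among siblings (local structure only)
        for i in range(k):
            for j in range(i + 1, k):
                if rng.random() < p_intra:
                    a, b = children[v][i], children[v][j]
                    new_adj[a].add(b)
                    new_adj[b].add(a)
    # every parent edge induces an edge between the two child clusters,
    # so the coarse (global) structure built earlier is preserved
    for u in adj:
        for v in adj[u]:
            if u < v:  # visit each undirected edge once
                a, b = rng.choice(children[u]), rng.choice(children[v])
                new_adj[a].add(b)
                new_adj[b].add(a)
    return new_adj

def generate(num_steps=4, seed=0):
    """Grow a graph from a single node by repeated local expansion."""
    rng = random.Random(seed)
    adj = {0: set()}
    for _ in range(num_steps):
        adj = expand_step(adj, rng=rng)
    return adj
```

Because each step only touches a node's cluster and its parent's edges, the work per step is proportional to the current graph size rather than to the number of node pairs, which is the intuition behind the subquadratic runtime claimed above.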

Authors (4)
  1. Andreas Bergmeister (3 papers)
  2. Karolis Martinkus (12 papers)
  3. Nathanaël Perraudin (38 papers)
  4. Roger Wattenhofer (212 papers)
Citations (9)