Graph Generation via Spectral Diffusion (2402.18974v1)

Published 29 Feb 2024 in cs.LG

Abstract: In this paper, we present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process. Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues, from which we can reconstruct the graph Laplacian and adjacency matrix. Our permutation-invariant model can also handle node features by concatenating them to the eigenvectors of each node. Using the Laplacian spectrum allows us to naturally capture the structural characteristics of the graph and work directly in the node space, while avoiding the quadratic complexity bottleneck that limits the applicability of other methods. This is achieved by truncating the spectrum, which, as our experiments show, results in a faster yet accurate generative process. An extensive set of experiments on both synthetic and real-world graphs demonstrates the strengths of our model against state-of-the-art alternatives.
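The reconstruction step the abstract refers to — rebuilding a Laplacian, and from it an adjacency matrix, out of a truncated set of eigenpairs — can be sketched in plain NumPy. This is not the GRASP model itself (the denoising network that *samples* eigenvectors and eigenvalues is omitted, and all function names here are illustrative, not the paper's); it only shows the deterministic spectral truncation and reconstruction, and how accuracy degrades gracefully as the spectrum is cut.

```python
import numpy as np

def laplacian(adj):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(adj.sum(axis=1)) - adj

def truncated_spectrum(adj, k):
    """The k smallest eigenpairs of L: the low-frequency part of the
    spectrum, which carries the graph's global structure."""
    lam, U = np.linalg.eigh(laplacian(adj))
    return lam[:k], U[:, :k]

def reconstruct_laplacian(lam_k, U_k):
    """Rank-k approximation L ~= U_k diag(lam_k) U_k^T."""
    return U_k @ np.diag(lam_k) @ U_k.T

def adjacency_from_laplacian(L, thresh=0.5):
    """Heuristic edge read-off: A_ij ~= -L_ij for i != j, binarized
    by a threshold. (In GRASP this step is learned, not thresholded.)"""
    A = -L.copy()
    np.fill_diagonal(A, 0.0)
    return (A > thresh).astype(int)

if __name__ == "__main__":
    # Toy input: a 6-node ring graph.
    n = 6
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

    L = laplacian(A)
    for k in (3, 4, 5, 6):
        lam_k, U_k = truncated_spectrum(A, k)
        err = np.linalg.norm(L - reconstruct_laplacian(lam_k, U_k))
        print(f"k={k}: Frobenius reconstruction error {err:.3f}")
```

With k = n the reconstruction is exact; for k < n the Frobenius error equals the root-sum-square of the dropped eigenvalues, so it shrinks monotonically as k grows — the speed-versus-accuracy trade-off the abstract attributes to spectral truncation.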
