
NVDiff: Graph Generation through the Diffusion of Node Vectors (2211.10794v2)

Published 19 Nov 2022 in cs.LG

Abstract: Learning to generate graphs is challenging because a graph is a set of pairwise connected, unordered nodes encoding complex combinatorial structure. Recently, several works have proposed graph generative models based on normalizing flows or score-based diffusion models. However, these models must generate nodes and edges in parallel from the same process, whose dimensionality is unnecessarily high. We propose NVDiff, which takes the VGAE structure and uses a score-based generative model (SGM) as a flexible prior to sample node vectors. By modeling only node vectors in the latent space, NVDiff significantly reduces the dimension of the diffusion process and thus improves sampling speed. Built on the NVDiff framework, we introduce an attention-based score network capable of capturing both local and global contexts of graphs. Experiments indicate that NVDiff significantly reduces computation and can model much larger graphs than competing methods, while achieving performance superior or comparable to previous methods across various datasets.
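The core idea in the abstract — diffuse only n×d latent node vectors instead of the n×n edge space, then recover edges from node vectors alone — can be illustrated with a minimal NumPy sketch. All shapes, the cosine noise schedule, the toy attention step, and the inner-product decoder here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(node_feats, W_enc):
    # Toy VGAE-style encoder: map node features to latent node vectors.
    return node_feats @ W_enc

def diffuse(z, t, noise):
    # Forward diffusion (VP-style, toy cosine schedule): blend latents
    # toward Gaussian noise as t goes from 0 to 1.
    alpha = np.cos(0.5 * np.pi * t)
    return alpha * z + np.sqrt(1.0 - alpha ** 2) * noise

def attention_score(z_t):
    # Toy attention-based score estimate: each node attends to all others,
    # so the output mixes in global graph context. Output matches z_t's shape.
    logits = z_t @ z_t.T / np.sqrt(z_t.shape[1])
    attn = np.exp(logits - logits.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)
    return attn @ z_t

def decode(z):
    # Inner-product decoder: edge probabilities come from node vectors only,
    # so the diffusion never has to model the n*n adjacency directly.
    return 1.0 / (1.0 + np.exp(-(z @ z.T)))

n, d_feat, d_lat = 6, 8, 4              # 6 nodes; n*d_lat = 24 < n*n = 36
x = rng.normal(size=(n, d_feat))        # node features
W = rng.normal(size=(d_feat, d_lat))    # encoder weights (untrained, toy)
z = encode(x, W)                        # latent node vectors, shape (n, d_lat)
z_t = diffuse(z, t=0.5, noise=rng.normal(size=z.shape))
score = attention_score(z_t)            # score-network output, shape (n, d_lat)
adj_probs = decode(z)                   # symmetric (n, n) edge probabilities
```

The dimension saving the abstract claims is visible in the shapes: the diffusion process operates on an (n, d_lat) array rather than an (n, n) adjacency, which shrinks the sampled space whenever d_lat < n.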

Authors (4)
  1. Xiaohui Chen (73 papers)
  2. Yukun Li (34 papers)
  3. Aonan Zhang (32 papers)
  4. Li-Ping Liu (27 papers)
Citations (17)
