
GLAD: Improving Latent Graph Generative Modeling with Simple Quantization (2403.16883v5)

Published 25 Mar 2024 in cs.LG and stat.ML

Abstract: Learning graph generative models over latent spaces has received less attention than models that operate on the original data space, and has so far shown lacklustre performance. We present GLAD, a latent space graph generative model. Unlike most previous latent space graph generative models, GLAD operates on a discrete latent space that preserves, to a significant extent, the discrete nature of graph structures, making no unnatural assumptions such as latent space continuity. We learn the prior of our discrete latent space by adapting diffusion bridges to its structure. By operating over an appropriately constructed latent space, we avoid relying on the decompositions often used by models that operate in the original data space. We present experiments on a series of graph benchmark datasets demonstrating that GLAD, the first equivariant latent graph generative method, achieves performance competitive with state-of-the-art baselines.
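The abstract does not spell out the quantization mechanism. As a purely illustrative sketch of the kind of simple scalar quantization the title alludes to (in the spirit of finite scalar quantization), the snippet below rounds per-node latent embeddings onto a small fixed grid of values and keeps gradients flowing with a straight-through estimator. The function name `fsq_quantize`, the tensor shapes, and the number of levels are assumptions for illustration, not the paper's actual configuration.

```python
import torch

def fsq_quantize(z: torch.Tensor, levels: int = 5) -> torch.Tensor:
    """Illustrative FSQ-style rounding of latent node embeddings (assumed setup).

    Each latent dimension is squashed into a bounded range, snapped to one of
    `levels` evenly spaced values, and the rounding is made differentiable with
    a straight-through estimator.
    """
    half = (levels - 1) / 2.0
    bounded = torch.tanh(z) * half          # squash each dimension into (-half, half)
    quantized = torch.round(bounded)        # snap to the nearest integer level
    # Straight-through estimator: the forward pass uses the rounded values,
    # while the backward pass treats rounding as the identity.
    return bounded + (quantized - bounded).detach()

# Example: quantize per-node latents for a hypothetical graph with 12 nodes and 8 latent dims
node_latents = torch.randn(12, 8)
discrete_latents = fsq_quantize(node_latents)
```

Because the quantization acts elementwise on each node's embedding, it commutes with node permutations, which is consistent with the equivariance the abstract claims for the latent space.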

