GraphRCG: Self-Conditioned Graph Generation (2403.01071v2)

Published 2 Mar 2024 in cs.LG and cs.AI

Abstract: Graph generation generally aims to create new graphs that closely align with a specific graph distribution. Existing works often implicitly capture this distribution through the optimization of generators, potentially overlooking the intricacies of the distribution itself. Furthermore, these approaches generally neglect the insights offered by the learned distribution for graph generation. In contrast, in this work, we propose a novel self-conditioned graph generation framework designed to explicitly model graph distributions and employ these distributions to guide the generation process. We first perform self-conditioned modeling to capture the graph distributions by transforming each graph sample into a low-dimensional representation and optimizing a representation generator to create new representations reflective of the learned distribution. Subsequently, we leverage these bootstrapped representations as self-conditioned guidance for the generation process, thereby facilitating the generation of graphs that more accurately reflect the learned distributions. We conduct extensive experiments on generic and molecular graph datasets across various fields. Our framework demonstrates superior performance over existing state-of-the-art graph generation methods in terms of graph quality and fidelity to training data.
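The abstract describes a two-stage pipeline: (1) encode each training graph into a low-dimensional representation and train a generator over those representations, then (2) generate graphs guided by bootstrapped representations sampled from that generator. The sketch below illustrates only the data flow of these two stages; all names, the linear encoder/decoder, and the perturbation-based "representation generator" are hypothetical stand-ins (the paper itself uses learned diffusion models for both stages):

```python
# Hypothetical sketch of the two-stage self-conditioned pipeline from the
# abstract. The linear encoder/decoder and the perturbation-based sampler
# are placeholders, not the paper's actual learned models.
import numpy as np

rng = np.random.default_rng(0)

N, D = 8, 4                                     # nodes per graph, representation dim
W_enc = rng.standard_normal((N * N, D)) * 0.1   # stand-in "encoder" weights
W_dec = rng.standard_normal((D, N * N)) * 0.1   # stand-in "decoder" weights

def encode(adj):
    """Stage 1a: map a graph sample to a low-dimensional representation."""
    return adj.reshape(-1) @ W_enc

def sample_representation(train_reps):
    """Stage 1b: stand-in for the representation generator -- here we merely
    perturb a training representation instead of sampling a learned model."""
    base = train_reps[rng.integers(len(train_reps))]
    return base + 0.05 * rng.standard_normal(D)

def generate_graph(rep):
    """Stage 2: decode a graph conditioned on the bootstrapped representation."""
    logits = (rep @ W_dec).reshape(N, N)
    logits = (logits + logits.T) / 2            # symmetrize: undirected graph
    adj = (logits > 0).astype(int)
    np.fill_diagonal(adj, 0)                    # no self-loops
    return adj

# Toy "training set": random undirected adjacency matrices.
train_graphs = [(rng.random((N, N)) > 0.7).astype(int) for _ in range(16)]
train_reps = np.stack([encode(g) for g in train_graphs])

rep = sample_representation(train_reps)         # self-conditioned guidance signal
new_graph = generate_graph(rep)                 # graph generated under that guidance
```

The point of the sketch is the ordering: representations are generated first, then used as conditioning input for graph generation, rather than the generator learning the distribution only implicitly.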

