
Graph Deconvolutional Generation (2002.07087v1)

Published 14 Feb 2020 in cs.LG and stat.ML

Abstract: Graph generation is an extremely important task, as graphs are found throughout different areas of science and engineering. In this work, we focus on the modern equivalent of the Erdos-Renyi random graph model: the graph variational autoencoder (GVAE). This model assumes edges and nodes are independent in order to generate entire graphs at once using a multi-layer perceptron decoder. As a result of these assumptions, GVAE has difficulty matching the training distribution and relies on an expensive graph matching procedure. We improve this class of models by building a message passing neural network into GVAE's encoder and decoder. We demonstrate our model on the specific task of generating small organic molecules.
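The abstract's central change is replacing GVAE's independence-assuming MLP decoder with message passing, so node representations are conditioned on their neighbors. A minimal NumPy sketch of one message-passing round is below; the function and weight names are illustrative, not the authors' exact architecture:

```python
import numpy as np

def message_passing_step(H, A, W_msg, W_upd):
    """One round of neighborhood message passing (illustrative sketch).

    H: (n, d) node features; A: (n, n) adjacency matrix;
    W_msg, W_upd: (d, d) learned weight matrices.
    Each node sums transformed features from its neighbors, then
    combines them with its own transformed state.
    """
    M = A @ (H @ W_msg)            # aggregate messages from neighbors
    return np.tanh(H @ W_upd + M)  # update node states

# Tiny example: a 3-node path graph 0-1-2
rng = np.random.default_rng(0)
d = 4
H = rng.normal(size=(3, d))
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
W_msg = rng.normal(size=(d, d)) * 0.1
W_upd = rng.normal(size=(d, d)) * 0.1
H1 = message_passing_step(H, A, W_msg, W_upd)
print(H1.shape)
```

Stacking a few such rounds inside both the encoder and decoder is what lets edge and node predictions depend on graph structure, rather than being generated independently as in the original GVAE.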

Authors (3)
  1. Daniel Flam-Shepherd (9 papers)
  2. Tony Wu (11 papers)
  3. Alan Aspuru-Guzik (61 papers)
Citations (29)
