Enhanced Variational Inference with Dyadic Transformation (1901.10621v2)
Published 30 Jan 2019 in cs.LG and stat.ML
Abstract: The variational autoencoder (VAE) is a powerful deep generative model trained with variational inference. Modeling the latent variables in the VAE's original formulation as normal distributions with diagonal covariance matrices limits the flexibility to match the true posterior distribution. We propose a new transformation, the dyadic transformation (DT), that can model a multivariate normal distribution with a full covariance matrix. DT is a single-stage transformation with low computational requirements. We demonstrate empirically on the MNIST dataset that DT enhances posterior flexibility and attains competitive results compared to other VAE enhancements.
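The core idea the abstract describes — a single-stage linear transformation turning a diagonal-Gaussian latent sample into one with a full covariance matrix — can be sketched as follows. This is a minimal illustration, not the paper's actual DT: the mixing matrix `W` and the example values of `mu` and `sigma` are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3  # latent dimensionality

# Encoder outputs for one data point (illustrative values, not from the paper).
mu = np.array([0.5, -1.0, 2.0])
sigma = np.array([1.0, 0.5, 2.0])  # diagonal standard deviations

# Lower-triangular mixing matrix with unit diagonal, standing in for the
# paper's learned dyadic transformation (its exact form is an assumption here).
W = np.tril(rng.normal(size=(d, d)))
np.fill_diagonal(W, 1.0)

def sample_transformed(n):
    """Reparameterized sampling: eps ~ N(0, I), z = mu + W @ (sigma * eps)."""
    eps = rng.normal(size=(n, d))
    return mu + (sigma * eps) @ W.T

z = sample_transformed(200_000)

# A single linear stage induces covariance W diag(sigma^2) W.T, which is
# generally non-diagonal -- the flexibility gain over a diagonal posterior.
analytic_cov = W @ np.diag(sigma**2) @ W.T
empirical_cov = np.cov(z, rowvar=False)
print("max deviation:", np.abs(analytic_cov - empirical_cov).max())
```

Because the transformation is a single matrix product applied to the standard reparameterized sample, sampling cost stays low, consistent with the abstract's claim of low computational requirements.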