Preventing posterior collapse in variational autoencoders for text generation via decoder regularization
Published 28 Oct 2021 in cs.LG and cs.CL | arXiv:2110.14945v1
Abstract: Variational autoencoders trained to minimize the reconstruction error are prone to the posterior collapse problem, in which the approximate (proposal) posterior distribution becomes equal to the prior and the latent code is ignored. We propose a novel regularization method based on fraternal dropout to prevent posterior collapse. We evaluate our approach using several metrics and observe improvements in all the tested configurations.
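The abstract gives no implementation details, but the core idea of fraternal dropout is to run the same network twice with two independent dropout masks on the same input and penalize the discrepancy between the two outputs. Below is a minimal PyTorch sketch of how such a penalty might be added to a sentence VAE's ELBO, regularizing the decoder so it cannot rely on memorization and ignore the latent variable. All names (TextVAE, loss_with_fraternal_dropout), the architecture, and the weights kappa and beta are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextVAE(nn.Module):
    """Minimal sentence VAE: LSTM encoder/decoder with a Gaussian latent (hypothetical)."""
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128,
                 latent_dim=32, dropout=0.3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        self.z_to_h = nn.Linear(latent_dim, hidden_dim)
        self.decoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.dropout = nn.Dropout(dropout)  # source of stochasticity for fraternal dropout
        self.out = nn.Linear(hidden_dim, vocab_size)

    def encode(self, x):
        _, (h, _) = self.encoder(self.embed(x))
        h = h.squeeze(0)
        return self.to_mu(h), self.to_logvar(h)

    def decode(self, z, x_in):
        # In training mode, a fresh dropout mask is sampled on every call.
        h0 = torch.tanh(self.z_to_h(z)).unsqueeze(0)
        c0 = torch.zeros_like(h0)
        out, _ = self.decoder(self.embed(x_in), (h0, c0))
        return self.out(self.dropout(out))  # logits over the vocabulary

def loss_with_fraternal_dropout(model, x_in, x_tgt, kappa=0.1, beta=1.0):
    """ELBO plus a fraternal-dropout penalty on the decoder (sketch).

    The decoder is run twice with independent dropout masks on the same
    latent sample; the mean squared difference between the two logit
    tensors is penalized, which regularizes the decoder.
    """
    mu, logvar = model.encode(x_in)
    z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization

    logits_a = model.decode(z, x_in)  # first dropout mask
    logits_b = model.decode(z, x_in)  # second, independent dropout mask

    rec = F.cross_entropy(logits_a.transpose(1, 2), x_tgt)          # reconstruction
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())   # KL to N(0, I)
    frat = (logits_a - logits_b).pow(2).mean()                      # fraternal penalty

    return rec + beta * kl + kappa * frat

# Hypothetical usage with random token ids (shifted for teacher forcing)
model = TextVAE()
x = torch.randint(0, 1000, (8, 20))  # batch of 8 sequences, length 20
loss = loss_with_fraternal_dropout(model, x[:, :-1], x[:, 1:])
loss.backward()
```

Note that, unlike KL annealing or free-bits schemes that weaken the KL term, this kind of penalty acts on the decoder itself, which is consistent with the abstract's framing of the method as decoder regularization.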