
APo-VAE: Text Generation in Hyperbolic Space (2005.00054v3)

Published 30 Apr 2020 in cs.LG and stat.ML

Abstract: Natural language often exhibits inherent hierarchical structure ingrained with complex syntax and semantics. However, most state-of-the-art deep generative models learn embeddings only in Euclidean vector space, without accounting for this structural property of language. In this paper, we investigate text generation in a hyperbolic latent space to learn continuous hierarchical representations. An Adversarial Poincaré Variational Autoencoder (APo-VAE) is presented, where both the prior and variational posterior of latent variables are defined over a Poincaré ball via wrapped normal distributions. By adopting the primal-dual formulation of KL divergence, an adversarial learning procedure is introduced to empower robust model training. Extensive experiments in language modeling and dialog-response generation tasks demonstrate the effectiveness of the proposed APo-VAE model over VAEs in Euclidean latent space, thanks to its superb capabilities in capturing latent language hierarchies in hyperbolic space.
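The wrapped normal distribution mentioned in the abstract is constructed by sampling a Euclidean Gaussian in the tangent space at the origin, parallel-transporting it to the mean, and pushing it onto the Poincaré ball via the exponential map. The following is a minimal NumPy sketch of that sampling procedure (unit curvature assumed; function names are illustrative, not from the paper's code):

```python
import numpy as np

def mobius_add(x, y):
    # Möbius addition on the unit Poincaré ball (curvature c = 1)
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    return num / (1 + 2 * xy + x2 * y2)

def exp_map(mu, v):
    # Exponential map at mu: sends a tangent vector v onto the ball
    lam = 2.0 / (1.0 - np.dot(mu, mu))   # conformal factor at mu
    vn = np.linalg.norm(v)
    if vn < 1e-12:
        return mu
    return mobius_add(mu, np.tanh(lam * vn / 2.0) * v / vn)

def sample_wrapped_normal(mu, sigma, rng):
    # Wrapped normal: Gaussian sample in the tangent space at the
    # origin, parallel-transported to mu, then mapped onto the ball.
    v = rng.normal(0.0, sigma, size=mu.shape)  # tangent sample at origin
    lam = 2.0 / (1.0 - np.dot(mu, mu))
    u = 2.0 * v / lam                          # parallel transport 0 -> mu
    return exp_map(mu, u)

rng = np.random.default_rng(0)
mu = np.array([0.3, -0.1])
z = sample_wrapped_normal(mu, 0.2, rng)
```

Because `tanh` is bounded by 1 and Möbius addition is closed on the ball, samples always remain inside the unit ball, which is what lets the VAE's latent codes respect the hyperbolic geometry.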

Authors (6)
  1. Shuyang Dai (15 papers)
  2. Zhe Gan (135 papers)
  3. Yu Cheng (354 papers)
  4. Chenyang Tao (29 papers)
  5. Lawrence Carin (203 papers)
  6. Jingjing Liu (139 papers)
Citations (30)
