
q-VAE for Disentangled Representation Learning and Latent Dynamical Systems (2003.01852v3)

Published 4 Mar 2020 in cs.LG, cs.RO, and stat.ML

Abstract: A variational autoencoder (VAE) derived from Tsallis statistics, called q-VAE, is proposed. In the proposed method, a standard VAE is employed to statistically extract the latent space hidden in sampled data; this latent space helps make robots controllable within feasible computational time and cost. To improve the usefulness of the latent space, this paper focuses on disentangled representation learning, for which $\beta$-VAE is the baseline. Starting from a Tsallis statistics perspective, a new lower bound for the proposed q-VAE is derived to maximize the likelihood of the sampled data; it can be regarded as an adaptive $\beta$-VAE with a deformed Kullback-Leibler divergence. To verify the benefits of the proposed q-VAE, a benchmark task of extracting the latent space from the MNIST dataset was performed. The results demonstrate that the proposed q-VAE improves disentangled representation while maintaining the reconstruction accuracy of the data. In addition, it relaxes the independence condition between data, which is demonstrated by learning the latent dynamics of nonlinear dynamical systems. By combining this with disentangled representation, the proposed q-VAE achieves stable and accurate long-term state prediction from an initial state and an action sequence. The dataset for hexapod walking is available on IEEE Dataport, doi: https://dx.doi.org/10.21227/99af-jw71.
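
The lower bound described in the abstract replaces the ordinary logarithm in the ELBO with the Tsallis q-logarithm, which leads to a data-dependent weighting of the regularization term. Below is a minimal, illustrative PyTorch sketch of such a q-deformed objective, assuming a Gaussian encoder and decoder; the function names, the fixed decoder variance, and the specific adaptive weight $p(x|z)^{1-q}$ are assumptions made for illustration, not the paper's exact derivation.

```python
# Illustrative q-deformed VAE objective (not the authors' code).
# The Tsallis q-logarithm ln_q(x) = (x^(1-q) - 1) / (1 - q) recovers ln(x) as q -> 1.
import torch

def q_log(x, q):
    """Tsallis q-logarithm ln_q(x); equals log(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-8:
        return torch.log(x)
    return (x.pow(1.0 - q) - 1.0) / (1.0 - q)

def kl_gaussian(mu, logvar):
    """Closed-form KL( N(mu, diag(exp(logvar))) || N(0, I) ) per sample."""
    return 0.5 * torch.sum(logvar.exp() + mu.pow(2) - 1.0 - logvar, dim=-1)

def q_vae_loss(x, x_recon, mu, logvar, q=0.8, sigma=1.0):
    """Negative q-deformed lower bound (to be minimized).

    Replacing log with ln_q in the reconstruction likelihood, and scaling the
    KL term by p(x|z)^(1-q), mimics the adaptive-beta behaviour described in
    the abstract.  This is an approximation for illustration, not the exact
    bound derived in the paper.
    """
    # Diagonal-Gaussian reconstruction log-likelihood (up to a constant).
    log_px = -0.5 * torch.sum((x - x_recon).pow(2) / sigma**2, dim=-1)
    px = log_px.exp().clamp_min(1e-12)          # likelihood p(x|z)
    recon_term = q_log(px, q)                   # q-deformed log-likelihood
    kl_term = kl_gaussian(mu, logvar)
    beta_adaptive = px.detach().pow(1.0 - q)    # data-dependent KL weight
    return torch.mean(-recon_term + beta_adaptive * kl_term)
```

Setting q = 1 in this sketch recovers the standard VAE objective: the q-logarithm reduces to the ordinary logarithm and the adaptive weight on the KL term becomes 1.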

Citations (16)

