
On the Transformation of Latent Space in Autoencoders (1901.08479v2)

Published 24 Jan 2019 in cs.LG and stat.ML

Abstract: Noting the importance of latent variables in inference and learning, we propose a novel autoencoder framework based on a homeomorphic transformation of the latent variables, which can reduce the distance between vectors in the transformed space while preserving the topological properties of the original space. We investigate the effect of this latent-space transformation on learning generative models and on denoising corrupted data. Experimental results demonstrate that generative and denoising models built on the proposed framework outperform conventional variational and denoising autoencoders, owing to the transformation. Performance is evaluated both in terms of the Hausdorff distance between the sets of training and processed (i.e., generated or denoised) images, which objectively measures their differences, and through direct comparison of the visual characteristics of the processed images.
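
The Hausdorff distance used for evaluation measures how far two point sets are from each other: it is the larger of the two directed distances, where each directed distance is the worst-case nearest-neighbor distance from one set to the other. A minimal NumPy sketch (the `hausdorff_distance` helper and the treatment of images as flattened vectors are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def hausdorff_distance(A, B):
    """Hausdorff distance between point sets A (m, d) and B (n, d).

    For image sets, each row would be a flattened image vector.
    """
    # Pairwise Euclidean distance matrix D[i, j] = ||A[i] - B[j]||.
    diff = A[:, None, :] - B[None, :, :]
    D = np.sqrt((diff ** 2).sum(axis=-1))
    # Directed distances: worst-case nearest-neighbor gap in each direction.
    d_ab = D.min(axis=1).max()  # sup over A of inf over B
    d_ba = D.min(axis=0).max()  # sup over B of inf over A
    return max(d_ab, d_ba)

# Toy usage: the set {(0,0), (1,0)} vs. the singleton {(0,0)}.
A = np.array([[0.0, 0.0], [1.0, 0.0]])
B = np.array([[0.0, 0.0]])
print(hausdorff_distance(A, B))  # 1.0: the point (1,0) is distance 1 from B
```

This brute-force version is O(mn) in memory; for large image sets one would compute the directed distances in chunks.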

Citations (5)
