
A Classification Supervised Auto-Encoder Based on Predefined Evenly-Distributed Class Centroids (1902.00220v3)

Published 1 Feb 2019 in cs.CV

Abstract: Classic variational autoencoders (VAEs), built on standard function approximators, are used to learn complex data distributions and have shown promise on many complex tasks. In this paper, a new autoencoder model, the classification supervised autoencoder (CSAE) based on predefined evenly-distributed class centroids (PEDCC), is proposed. Our method uses PEDCC as targets for the latent variables, training the network to maximize inter-class distance and minimize intra-class distance. Instead of learning the mean and variance of the latent distribution and applying the VAE reparameterization trick, the latent variables of CSAE are used directly both for classification and as input to the decoder. In addition, a new loss function is proposed that combines the reconstruction loss with a classification loss. Built on the basic structure of a universal autoencoder, the model achieves encoding, decoding, and classification simultaneously, along with good generalization performance. The theoretical advantages are reflected in the experimental results.
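To make the abstract's description more concrete, here is a minimal PyTorch sketch of the general idea, not the authors' implementation: the class centroids (`make_pedcc`), network sizes, and loss weights `alpha`/`beta` are illustrative assumptions, and the centroid construction here simply pushes unit vectors apart rather than reproducing the paper's exact PEDCC procedure.

```python
# Minimal sketch of a classification supervised autoencoder (CSAE) with
# predefined class centroids. Assumptions: flattened 784-dim inputs,
# 10 classes, 64-dim latent space.
import torch
import torch.nn as nn
import torch.nn.functional as F


def make_pedcc(num_classes: int, latent_dim: int, steps: int = 2000) -> torch.Tensor:
    """Spread `num_classes` unit vectors apart by penalizing pairwise similarity
    (an illustrative stand-in for the paper's PEDCC construction)."""
    c = torch.randn(num_classes, latent_dim, requires_grad=True)
    opt = torch.optim.Adam([c], lr=0.1)
    for _ in range(steps):
        u = F.normalize(c, dim=1)
        sim = u @ u.t() - torch.eye(num_classes)  # off-diagonal cosine similarities
        loss = sim.pow(2).sum()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return F.normalize(c.detach(), dim=1)  # fixed targets, not trained with the network


class CSAE(nn.Module):
    def __init__(self, in_dim: int = 784, latent_dim: int = 64, num_classes: int = 10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                     nn.Linear(256, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                     nn.Linear(256, in_dim), nn.Sigmoid())
        # Predefined centroids are buffers (targets), not learnable parameters.
        self.register_buffer("centroids", make_pedcc(num_classes, latent_dim))

    def forward(self, x):
        z = self.encoder(x)      # latent code used directly, no reparameterization
        x_hat = self.decoder(z)  # reconstruction from the same latent code
        logits = -torch.cdist(z, self.centroids)  # classify by distance to centroids
        return z, x_hat, logits


def csae_loss(x, x_hat, z, logits, y, centroids, alpha=1.0, beta=1.0):
    recon = F.mse_loss(x_hat, x)            # autoencoder reconstruction term
    pull = F.mse_loss(z, centroids[y])      # pull latent codes toward their class centroid
    clf = F.cross_entropy(logits, y)        # distance-based classification term
    return recon + alpha * pull + beta * clf
```

Keeping the centroids fixed and evenly spread is what enforces large inter-class and small intra-class distances in the latent space; the exact loss weighting and centroid construction in the paper may differ from this sketch.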

Citations (34)

Summary

We haven't generated a summary for this paper yet.


Follow-up Questions

We haven't generated follow-up questions for this paper yet.

Authors (2)
