ZClassifier: Temperature Tuning and Manifold Approximation via KL Divergence on Logit Space (2507.10638v1)
Abstract: We introduce a novel classification framework, ZClassifier, that replaces conventional deterministic logits with diagonal Gaussian-distributed logits. Our method simultaneously addresses temperature scaling and manifold approximation by minimizing the Kullback-Leibler (KL) divergence between the predicted Gaussian distributions and a unit isotropic Gaussian. This unifies uncertainty calibration and latent control in a principled probabilistic manner, enabling a natural interpretation of class confidence and geometric consistency. Experiments on CIFAR-10 and CIFAR-100 show that ZClassifier improves over softmax classifiers in robustness, calibration, and latent separation. We also demonstrate its effectiveness for classifier-guided generation by interpreting logits as Gaussian semantic potentials.
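The mechanism the abstract describes — a head that predicts a diagonal Gaussian over the logit vector, regularized by the KL divergence to a unit isotropic Gaussian — can be illustrated with a short sketch. The following PyTorch code is a minimal illustration, not the authors' implementation: the name `ZClassifierHead`, the loss weight `beta`, and the choice of using a reparameterized sample of the Gaussian logits in the cross-entropy term are all assumptions that the abstract does not pin down.

```python
# Hypothetical sketch of the ZClassifier idea from the abstract: the head
# predicts a diagonal Gaussian over class logits, a sample of those logits
# feeds the cross-entropy loss, and a KL term pulls each predicted Gaussian
# toward the unit isotropic Gaussian N(0, I).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ZClassifierHead(nn.Module):
    """Predicts a diagonal Gaussian over class logits instead of point logits."""

    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        self.mu = nn.Linear(feat_dim, num_classes)      # per-class logit means
        self.logvar = nn.Linear(feat_dim, num_classes)  # per-class log-variances

    def forward(self, h: torch.Tensor):
        mu, logvar = self.mu(h), self.logvar(h)
        std = torch.exp(0.5 * logvar)
        z = mu + std * torch.randn_like(std)            # reparameterized logit sample
        return z, mu, logvar

def kl_to_unit_gaussian(mu: torch.Tensor, logvar: torch.Tensor) -> torch.Tensor:
    """KL( N(mu, diag(sigma^2)) || N(0, I) ), averaged over the batch."""
    return 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1.0).sum(dim=1).mean()

def zclassifier_loss(z, mu, logvar, targets, beta: float = 1.0):
    """Cross-entropy on sampled logits plus the KL regularizer (beta assumed)."""
    return F.cross_entropy(z, targets) + beta * kl_to_unit_gaussian(mu, logvar)

# Usage with stand-in backbone features for a CIFAR-10-sized problem.
head = ZClassifierHead(feat_dim=512, num_classes=10)
h = torch.randn(8, 512)
z, mu, logvar = head(h)
loss = zclassifier_loss(z, mu, logvar, torch.randint(0, 10, (8,)))
loss.backward()
```

One way to read this sketch: the KL term bounds the scale of the logits, which plays the role that temperature scaling plays for a softmax classifier, while also shaping the logit space toward a known Gaussian geometry, matching the abstract's claim that calibration and manifold approximation are handled by a single objective.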