
Deep Representation Learning with an Information-theoretic Loss (2111.12950v5)

Published 25 Nov 2021 in cs.LG

Abstract: This paper proposes a deep representation learning method that uses an information-theoretic loss to increase inter-class distances as well as within-class similarity in the embedded space. Tasks such as anomaly and out-of-distribution detection, in which test samples come from classes unseen during training, are problematic for deep neural networks. For such tasks, it is not sufficient to merely discriminate between known classes. Our intuition is to represent the known classes in compact, well-separated embedded regions in order to decrease the possibility of known and unseen classes overlapping in the embedded space. We derive a loss from the Information Bottleneck principle that reflects both the inter-class distances and the compactness within classes, thereby extending existing deep data description models. Our empirical study shows that the proposed model improves the segmentation of normal classes in the deep feature space and subsequently contributes to identifying out-of-distribution samples.
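The abstract's core intuition (compact within-class regions, large inter-class distances) can be illustrated with a minimal NumPy sketch. This is only an illustration of that intuition, not the paper's Information Bottleneck-derived loss; the function name and weighting parameter below are hypothetical:

```python
import numpy as np

def compactness_separation_loss(embeddings, labels, separation_weight=1.0):
    """Illustrative loss: penalize within-class spread and reward
    inter-class centroid distance. A sketch of the stated intuition,
    not the paper's actual Information Bottleneck derivation."""
    classes = np.unique(labels)
    centroids = np.stack(
        [embeddings[labels == c].mean(axis=0) for c in classes]
    )

    # Within-class compactness: mean squared distance to own centroid.
    within = np.mean([
        np.mean(np.sum((embeddings[labels == c] - centroids[i]) ** 2, axis=1))
        for i, c in enumerate(classes)
    ])

    # Inter-class separation: mean squared distance between centroid pairs.
    diffs = centroids[:, None, :] - centroids[None, :, :]
    pair_sq_dists = np.sum(diffs ** 2, axis=-1)
    between = pair_sq_dists[np.triu_indices(len(classes), k=1)].mean()

    # Lower is better: tight clusters (small `within`), far apart (large `between`).
    return within - separation_weight * between
```

Minimizing such a quantity drives each class toward a tight, isolated region of the embedding space, which is the property the paper argues reduces overlap between known and unseen classes.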

Citations (2)
