OAL: Enhancing OOD Detection Using Latent Diffusion (2406.16525v2)

Published 24 Jun 2024 in stat.ML and cs.LG

Abstract: Numerous Out-of-Distribution (OOD) detection algorithms have been developed to identify unknown samples or objects in real-world model deployments. Outlier Exposure (OE) algorithms, a subset of these methods, typically employ auxiliary datasets to train OOD detectors, enhancing the reliability of their predictions. While previous methods have leveraged Stable Diffusion (SD) to generate pixel-space outliers, these can complicate network optimization. We propose an Outlier Aware Learning (OAL) framework, which synthesizes OOD training data directly in the latent space. To regularize the model's decision boundary, we introduce a mutual information-based contrastive learning approach that amplifies the distinction between In-Distribution (ID) and collected OOD features. The efficacy of this contrastive learning technique is supported by both theoretical analysis and empirical results. Furthermore, we integrate knowledge distillation into our framework to preserve in-distribution classification accuracy. The combined application of contrastive learning and knowledge distillation substantially improves OOD detection performance, enabling OAL to outperform other OE methods by a considerable margin. Source code is available at: https://github.com/HengGao12/OAL
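
The abstract names three ingredients: outliers synthesized in latent space, a mutual information-based contrastive term that separates ID from OOD features, and knowledge distillation to preserve ID classification accuracy. The sketch below is a minimal, hypothetical PyTorch rendering of how such a combined training objective could look. The function names, the InfoNCE-style form of the contrastive term, and the loss weights are assumptions for illustration; they are not taken from the official OAL code at the repository above.

```python
# Hypothetical sketch of an outlier-aware training loss (assumed PyTorch setup).
# `feature_extractor`, `classifier`, `teacher`, and `latent_outliers` are
# illustrative names, not the official OAL implementation.
import torch
import torch.nn.functional as F

def oal_style_loss(feature_extractor, classifier, teacher,
                   id_images, id_labels, latent_outliers,
                   temperature=0.1, kd_temp=4.0,
                   w_contrast=0.5, w_kd=0.5):
    """Combine ID classification, an ID-vs-OOD contrastive term, and
    knowledge distillation from a frozen teacher. Assumes batch size > 1."""
    # 1) Standard classification loss on in-distribution data.
    id_feats = feature_extractor(id_images)             # (B, D)
    logits = classifier(id_feats)
    ce = F.cross_entropy(logits, id_labels)

    # 2) Contrastive term: pull ID features together and push them away from
    #    outlier features synthesized in latent space (assumed to be passed in
    #    as a precomputed tensor of shape (M, D)).
    id_z = F.normalize(id_feats, dim=1)
    ood_z = F.normalize(latent_outliers, dim=1)
    pos = id_z @ id_z.t() / temperature                  # ID-ID similarities
    neg = id_z @ ood_z.t() / temperature                 # ID-OOD similarities
    # Mask self-similarity on the diagonal of the ID-ID block.
    pos = pos - torch.eye(pos.size(0), device=pos.device) * 1e9
    logits_all = torch.cat([pos, neg], dim=1)
    log_prob = F.log_softmax(logits_all, dim=1)
    # Treat the remaining ID entries as positives; the loss is the negative
    # average log-probability assigned to them (an InfoNCE-style surrogate).
    pos_mask = torch.cat([torch.ones_like(pos), torch.zeros_like(neg)], dim=1)
    pos_mask = (pos_mask > 0) & (logits_all > -1e8)
    contrast = -log_prob[pos_mask].mean()

    # 3) Knowledge distillation to preserve ID accuracy.
    with torch.no_grad():
        teacher_logits = teacher(id_images)
    kd = F.kl_div(F.log_softmax(logits / kd_temp, dim=1),
                  F.softmax(teacher_logits / kd_temp, dim=1),
                  reduction="batchmean") * kd_temp ** 2

    return ce + w_contrast * contrast + w_kd * kd
```

In this sketch the latent outliers are assumed to be generated offline by a latent diffusion model and supplied as feature tensors; the loss weights and temperatures would need tuning per backbone and dataset.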
