Predicted Embedding Power Regression for Large-Scale Out-of-Distribution Detection (2303.04115v2)

Published 7 Mar 2023 in cs.CV and cs.LG

Abstract: Out-of-distribution (OOD) inputs can compromise the performance and safety of real-world machine learning systems. While many methods exist for OOD detection and work well on small-scale datasets with lower-resolution images and few classes, few methods have been developed for large-scale OOD detection. Existing large-scale methods generally depend on the maximum classification probability, such as the state-of-the-art grouped softmax method. In this work, we develop a novel approach that calculates the probability of the predicted class label based on label distributions learned during the training process. Our method outperforms current state-of-the-art methods with only a negligible increase in compute cost. We evaluate our method against contemporary methods across $14$ datasets and achieve a statistically significant improvement with respect to AUROC (84.2 vs 82.4) and AUPR (96.2 vs 93.7).
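To make the contrast in the abstract concrete, the sketch below shows the maximum-softmax-probability (MSP) baseline that existing large-scale methods depend on, alongside an illustrative score that reweights the predicted class probability by label statistics gathered during training. This is a minimal sketch under assumed inputs (raw classifier logits and per-class training frequencies); the function names and the weighting scheme are hypothetical and are not the paper's exact formulation.

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over the class axis."""
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def msp_score(logits: np.ndarray) -> np.ndarray:
    """Maximum softmax probability baseline: higher score = more likely
    in-distribution. This is the quantity many large-scale OOD methods
    threshold on."""
    return softmax(logits).max(axis=1)

def label_prior_score(logits: np.ndarray,
                      train_label_freq: np.ndarray) -> np.ndarray:
    """Illustrative alternative (hypothetical, not the paper's method):
    weight the predicted class's probability by that class's relative
    frequency observed during training, so the score reflects label
    distributions learned at training time."""
    probs = softmax(logits)
    pred = probs.argmax(axis=1)
    return probs[np.arange(len(pred)), pred] * train_label_freq[pred]

# Example: two inputs over three classes; the second has uniform logits,
# so its MSP is 1/3 and it looks less in-distribution than the first.
logits = np.array([[2.0, 0.5, 0.1],
                   [0.1, 0.1, 0.1]])
train_label_freq = np.array([0.5, 0.3, 0.2])  # assumed training class frequencies
print(msp_score(logits))
print(label_prior_score(logits, train_label_freq))
```

In practice, either score would be thresholded (with the threshold chosen on a validation set) to flag inputs as OOD, and detector quality is summarized threshold-free via AUROC and AUPR, the metrics reported in the abstract.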

Authors (4)
  1. Hong Yang (78 papers)
  2. William Gebhardt (4 papers)
  3. Alexander G. Ororbia (15 papers)
  4. Travis Desell (29 papers)
