
A Simple Probabilistic Method for Deep Classification under Input-Dependent Label Noise (2003.06778v3)

Published 15 Mar 2020 in cs.LG and stat.ML

Abstract: Datasets with noisy labels are a common occurrence in practical applications of classification methods. We propose a simple probabilistic method for training deep classifiers under input-dependent (heteroscedastic) label noise. We assume an underlying heteroscedastic generative process for noisy labels. To make gradient-based training feasible we use a temperature-parameterized softmax as a smooth approximation to the assumed generative process. We illustrate that the softmax temperature controls a bias-variance trade-off for the approximation. By tuning the softmax temperature, we improve accuracy, log-likelihood, and calibration both on image classification benchmarks with controlled label noise and on ImageNet-21k, which has naturally occurring label noise. For image segmentation, our method increases the mean IoU on the PASCAL VOC and Cityscapes datasets by more than 1% over the state-of-the-art model.
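
Below is a minimal PyTorch sketch of the idea described in the abstract, under stated assumptions: the network predicts per-class mean utilities and input-dependent (heteroscedastic) noise scales, latent noisy utilities are sampled, and class probabilities are estimated by Monte Carlo averaging of a temperature-parameterized softmax. The class name, heads, hyperparameter values, and Gaussian noise choice are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch: heteroscedastic classification with a
# temperature-parameterized softmax (not the authors' official code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class HeteroscedasticClassifier(nn.Module):
    def __init__(self, feature_dim: int, num_classes: int,
                 temperature: float = 1.0, num_mc_samples: int = 10):
        super().__init__()
        self.logit_head = nn.Linear(feature_dim, num_classes)  # mean utilities
        self.scale_head = nn.Linear(feature_dim, num_classes)  # noise scales
        self.temperature = temperature          # controls bias-variance trade-off
        self.num_mc_samples = num_mc_samples    # Monte Carlo samples per input

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        mu = self.logit_head(features)                  # (B, C)
        sigma = F.softplus(self.scale_head(features))   # (B, C), input-dependent
        # Sample latent noisy utilities u = mu + sigma * eps, eps ~ N(0, 1).
        eps = torch.randn(self.num_mc_samples, *mu.shape,
                          device=mu.device, dtype=mu.dtype)
        utilities = mu.unsqueeze(0) + sigma.unsqueeze(0) * eps  # (S, B, C)
        # Tempered softmax as a smooth surrogate for the argmax in the
        # assumed generative process; average the MC samples to estimate p(y|x).
        probs = F.softmax(utilities / self.temperature, dim=-1)
        return probs.mean(dim=0)                        # (B, C)


if __name__ == "__main__":
    # Train by minimizing the NLL of the Monte Carlo probability estimate.
    model = HeteroscedasticClassifier(feature_dim=128, num_classes=10,
                                      temperature=0.5, num_mc_samples=20)
    x = torch.randn(32, 128)
    y = torch.randint(0, 10, (32,))
    loss = F.nll_loss(torch.log(model(x) + 1e-12), y)
    loss.backward()
```

In this sketch the temperature is a fixed hyperparameter to be tuned, matching the abstract's claim that tuning it trades approximation bias against gradient variance; a lower temperature approaches the hard generative process, a higher one yields smoother, lower-variance gradients.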
