Estimating Noisy Class Posterior with Part-level Labels for Noisy Label Learning (2405.05714v2)
Abstract: In noisy label learning, estimating noisy class posteriors plays a fundamental role in developing consistent classifiers, as it forms the basis for estimating clean class posteriors and the transition matrix. Existing methods typically learn noisy class posteriors by training a classification model with noisy labels. However, when labels are incorrect, these models may be misled into overemphasizing feature parts that do not reflect the instance's characteristics, resulting in significant errors in the estimated noisy class posteriors. To address this issue, this paper proposes augmenting the supervised information with part-level labels, encouraging the model to focus on and integrate richer information from various parts. Specifically, our method first partitions features into distinct parts by cropping instances, yielding part-level labels associated with these parts. Subsequently, we introduce a novel single-to-multiple transition matrix to model the relationship between the noisy label and the part-level labels, which incorporates part-level labels into a classifier-consistent framework. Using this framework, we can learn the noisy class posteriors more precisely by guiding the model to integrate information from various parts, ultimately improving classification performance. Our method is theoretically sound, and experiments show that it is empirically effective on both synthetic and real-world noisy-label benchmarks.
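The abstract's pipeline (crop an instance into parts, obtain per-part predictions, and map them through a transition matrix into noisy-label space) can be sketched roughly as follows. This is a minimal illustration, not the paper's actual method: the grid-cropping scheme, the averaging of part posteriors, and the use of a plain class-to-class transition matrix (rather than the paper's single-to-multiple transition matrix) are all simplifying assumptions, and every function name here is hypothetical.

```python
import numpy as np

def crop_into_parts(image, grid=2):
    """Split an HxWxC image into grid*grid equal tiles.

    A simple stand-in for the paper's instance-cropping step; the actual
    partitioning scheme used in the paper may differ.
    """
    h, w = image.shape[0] // grid, image.shape[1] // grid
    return [image[i*h:(i+1)*h, j*w:(j+1)*w]
            for i in range(grid) for j in range(grid)]

def noisy_posterior_from_parts(part_probs, T):
    """Combine per-part class posteriors into a noisy-label posterior.

    part_probs : (P, C) array of softmax outputs, one row per part.
    T          : (C, C) transition matrix with T[i, j] = P(noisy = i | clean = j).

    Averaging the part posteriors and applying T is a forward-correction-style
    combination; the paper's single-to-multiple transition matrix models the
    noisy-to-part-label relationship more elaborately.
    """
    clean_est = part_probs.mean(axis=0)  # integrate information across parts
    return T @ clean_est                 # map clean estimate to noisy-label space
```

With an identity transition matrix (no label noise), the combined noisy posterior reduces to the average of the part posteriors, which makes the role of `T` easy to check in isolation.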
Authors: Rui Zhao, Bin Shi, Jianfei Ruan, Tianze Pan, Bo Dong