Partial Label Supervision for Agnostic Generative Noisy Label Learning (2308.01184v2)

Published 2 Aug 2023 in cs.CV and cs.LG

Abstract: Noisy label learning has been tackled with both discriminative and generative approaches. Despite the simplicity and efficiency of discriminative methods, generative models offer a more principled way of disentangling clean and noisy labels and estimating the label transition matrix. However, existing generative methods often require inferring additional latent variables through costly generative modules or heuristic assumptions, which hinders adaptive optimisation across different causal directions. They also assume a uniform clean label prior, which fails to reflect the sample-wise clean label distribution and uncertainty. In this paper, we propose a novel framework for generative noisy label learning that addresses these challenges. First, we propose a new single-stage optimisation that directly approximates image generation with the output of a discriminative classifier. This approximation significantly reduces the computational cost of image generation, preserves the benefits of generative modelling, and makes our framework agnostic to the causal direction (i.e., whether the image generates the label or vice versa). Second, we introduce a new Partial Label Supervision (PLS) for noisy label learning that accounts for both clean label coverage and uncertainty. PLS does not merely aim to minimise the loss; it seeks to capture the underlying sample-wise clean label distribution and uncertainty. Extensive experiments on computer vision and NLP benchmarks demonstrate that our generative modelling achieves state-of-the-art results while significantly reducing the computational cost. Our code is available at https://github.com/lfb-1/GNL.
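
The abstract compresses two technical ideas: replacing an explicit generative image model with a classifier-based approximation of the transition-matrix objective, and supervising with per-sample partial (candidate) label sets. As illustrative context only, the first sketch below shows the classic forward loss correction that transition-matrix methods build on; it is standard background, not the paper's single-stage objective.

```python
import torch
import torch.nn.functional as F

def forward_corrected_loss(logits, noisy_labels, T):
    """Classic 'forward' loss correction: push the classifier's clean-label
    posterior through a row-stochastic transition matrix
    T[i, j] = p(noisy = j | clean = i), then score the observed noisy label.
    Standard background for transition-matrix methods, not the paper's
    single-stage objective."""
    clean_probs = logits.softmax(dim=1)    # p(y | x), shape [B, C]
    noisy_probs = clean_probs @ T          # p(y_noisy | x) = p(y | x) T
    return F.nll_loss(noisy_probs.clamp_min(1e-8).log(), noisy_labels)
```

The second sketch shows one plausible way to build and supervise partial labels so that the candidate set covers the unknown clean label (coverage) while its size expresses uncertainty. The function names, the confidence threshold, and the candidate-set heuristic are our assumptions for illustration; the authors' actual construction lives in the linked repository.

```python
def build_partial_labels(logits, noisy_labels, threshold=0.5):
    """Per-sample candidate set: the given (possibly noisy) label plus any
    class the current classifier predicts above `threshold`. The 0.5
    threshold and this heuristic are illustrative assumptions."""
    probs = logits.softmax(dim=1).detach()
    candidates = probs > threshold                            # confident classes
    candidates.scatter_(1, noisy_labels.unsqueeze(1), True)   # keep noisy label
    return candidates                                         # bool mask, [B, C]

def partial_label_loss(logits, candidates):
    """Maximise the probability mass on the candidate set,
    -log sum_{y in S} p(y | x), computed stably in log space; the loss is
    minimised by covering the set rather than committing to one class."""
    log_probs = F.log_softmax(logits, dim=1)
    masked = log_probs.masked_fill(~candidates, float("-inf"))
    return -torch.logsumexp(masked, dim=1).mean()
```

A design note on the trade-off the abstract alludes to: enlarging the candidate set makes it more likely to contain the clean label but weakens the training signal, so methods in this family typically shrink the set as the classifier grows more confident.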

