Class-Imbalanced Complementary-Label Learning via Weighted Loss (2209.14189v2)

Published 28 Sep 2022 in cs.LG

Abstract: Complementary-label learning (CLL) is widely used in weakly supervised classification, but it faces a significant challenge in real-world datasets when confronted with class-imbalanced training samples. In such scenarios, the number of samples in one class is considerably lower than in other classes, which leads to a decline in prediction accuracy. Unfortunately, existing CLL approaches have not investigated this problem. To alleviate this challenge, we propose a novel problem setting that enables learning from class-imbalanced complementary labels for multi-class classification. To tackle this problem, we propose a novel CLL approach called Weighted Complementary-Label Learning (WCLL). The proposed method models a weighted empirical risk minimization loss using the class-imbalanced complementary labels, and is also applicable to multi-class imbalanced training samples. Furthermore, we derive an estimation error bound to provide theoretical assurance. To evaluate our approach, we conduct extensive experiments on several widely-used benchmark datasets and a real-world dataset, and compare our method with existing state-of-the-art methods. The proposed approach shows significant improvement on these datasets, even in scenarios with multiple imbalanced classes. Notably, the proposed method not only utilizes complementary labels to train a classifier but also solves the problem of class imbalance.
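
To make the weighted empirical-risk idea concrete, here is a minimal sketch of a weighted complementary-label loss in PyTorch. It is an illustration under stated assumptions, not the paper's exact WCLL formulation: the surrogate -log(1 - p_ybar) is a common CLL loss from the broader literature, the inverse-frequency class weights are one plausible weighting scheme, and the function name and interface are hypothetical.

import torch
import torch.nn.functional as F

def weighted_complementary_loss(logits, comp_labels, class_weights):
    """Sketch of a weighted complementary-label loss.

    logits:        (batch, K) raw model outputs.
    comp_labels:   (batch,) long tensor; the class each sample does NOT belong to.
    class_weights: (K,) per-class weights, e.g. inverse class frequency
                   (assumed here; the paper's weighting may differ).
    """
    probs = F.softmax(logits, dim=1)
    # Probability the model assigns to the complementary (wrong-by-definition) class.
    p_comp = probs.gather(1, comp_labels.unsqueeze(1)).squeeze(1)
    # Penalize probability mass on the complementary label: -log(1 - p_comp),
    # a standard CLL surrogate; clamp for numerical stability.
    per_sample = -torch.log((1.0 - p_comp).clamp(min=1e-12))
    # Reweight each sample by its complementary label's class weight so that
    # rare classes contribute proportionally more to the empirical risk.
    w = class_weights[comp_labels]
    return (w * per_sample).sum() / w.sum()

Normalizing by the sum of the weights keeps the loss scale comparable across batches with different class compositions, which matters when the imbalance ratio varies between mini-batches.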
