
Label Distribution Learning from Logical Label (2303.06847v2)

Published 13 Mar 2023 in cs.LG and cs.AI

Abstract: Label distribution learning (LDL) is an effective method to predict the label description degree (a.k.a. label distribution) of a sample. However, annotating label distributions (LD) for training samples is extremely costly. Recent studies therefore often first use label enhancement (LE) to generate an estimated label distribution from the logical label, and then apply an external LDL algorithm to the recovered label distributions to predict the label distribution for unseen samples. However, this step-wise manner overlooks the possible connections between LE and LDL. Moreover, existing LE approaches may assign some description degrees to invalid labels. To solve the above problems, we propose a novel method to learn an LDL model directly from the logical label, which unifies LE and LDL into a joint model and avoids the drawbacks of previous LE methods. Extensive experiments on various datasets show that the proposed approach can construct a reliable LDL model directly from the logical label and produce more accurate label distributions than state-of-the-art LE methods.
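To make the terminology concrete, the sketch below illustrates the gap the abstract describes between a logical label (a binary relevant/irrelevant vector) and a label distribution (description degrees summing to one). The `naive_label_enhancement` function is a hypothetical baseline for illustration only, not the paper's method: it normalizes confidence scores over the valid labels so that invalid labels receive exactly zero degree, which is the constraint the paper criticizes existing LE approaches for violating.

```python
import numpy as np

def naive_label_enhancement(logical, scores):
    """Hypothetical LE baseline: spread description degrees only over
    labels marked relevant (1) in the logical label, so labels marked
    irrelevant (0) receive exactly zero degree."""
    masked = np.where(logical == 1, np.exp(scores), 0.0)  # zero out invalid labels
    return masked / masked.sum()  # degrees sum to 1

# Logical label: labels 0 and 2 are relevant, label 1 is not.
logical = np.array([1, 0, 1])
# Illustrative per-label confidence scores (assumed, not from the paper).
scores = np.array([2.0, 5.0, 1.0])

d = naive_label_enhancement(logical, scores)
# d is a valid label distribution, and d[1] == 0: no degree on the invalid label.
```

A joint LE+LDL model, as the paper proposes, would learn such distributions together with the predictor instead of fixing them in a separate pre-processing step.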

