Mao-Zedong At SemEval-2023 Task 4: Label Representation Multi-Head Attention Model With Contrastive Learning-Enhanced Nearest Neighbor Mechanism For Multi-Label Text Classification (2307.05174v1)
Abstract: The study of human values is essential in both practical and theoretical domains. With the development of computational linguistics, the creation of large-scale datasets has made it possible to recognize human values automatically and accurately. SemEval 2023 Task 4\cite{kiesel:2023} provides a set of arguments and 20 types of human values that are implicitly expressed in each argument. In this paper, we present our team's solution. We use the RoBERTa\cite{liu_roberta_2019} model to obtain the word vector encoding of the document and propose a multi-head attention mechanism to establish connections between specific labels and semantic components. Furthermore, we use a contrastive learning-enhanced k-nearest neighbor mechanism\cite{su_contrastive_2022} to leverage existing instance information for prediction. Our approach achieved an F1 score of 0.533 on the test set and ranked fourth on the leaderboard.
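The two components named in the abstract can be sketched as follows: label-specific attention pools the encoder's token embeddings into one representation per label, and the k-nearest-neighbor mechanism interpolates the model's probabilities with a vote over retrieved training instances. This is a minimal single-head NumPy sketch, not the authors' implementation; the interpolation weight `lam`, the neighbor count `k`, and the Euclidean distance are illustrative assumptions (the paper uses multi-head attention over RoBERTa encodings and a contrastive-learning-trained retrieval space).

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def label_attention(H, L):
    """Label-specific attention pooling (single head, for illustration).
    H: (T, d) token embeddings from the encoder (e.g. RoBERTa).
    L: (C, d) learned label embeddings, one per human value.
    Returns (C, d): one document representation per label."""
    d = H.shape[1]
    scores = L @ H.T / np.sqrt(d)    # (C, T) label-to-token affinities
    attn = softmax(scores, axis=-1)  # attention over tokens, per label
    return attn @ H                  # (C, d) label-specific pooling

def knn_interpolate(p_model, train_reprs, train_labels, query, k=3, lam=0.7):
    """Blend model probabilities with a k-NN vote over stored instances.
    p_model: (C,) per-label probabilities from the classifier.
    train_reprs: (N, d) stored document representations; train_labels: (N, C) 0/1.
    query: (d,) representation of the document being predicted."""
    dists = np.linalg.norm(train_reprs - query, axis=1)
    idx = np.argsort(dists)[:k]                # k nearest training instances
    p_knn = train_labels[idx].mean(axis=0)     # (C,) neighbor label frequency
    return lam * p_model + (1.0 - lam) * p_knn

# Toy shapes: T=5 tokens, d=4 dims, C=3 labels, N=10 stored instances.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 4))
L = rng.normal(size=(3, 4))
V = label_attention(H, L)                      # (3, 4) label-specific reps

p_model = np.array([0.9, 0.2, 0.6])
train_reprs = rng.normal(size=(10, 4))
train_labels = (rng.random(size=(10, 3)) > 0.5).astype(float)
p_final = knn_interpolate(p_model, train_reprs, train_labels, V.mean(axis=0))
```

A threshold (e.g. 0.5) on `p_final` would then yield the multi-label prediction; in the cited mechanism the retrieval space is shaped by a contrastive objective so that nearest neighbors share labels.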
- Mark J. Berger. Large-scale multi-label text classification with semantic word vectors.
- Multi-label text classification approach for sentence level news emotion analysis. In Pattern Recognition and Machine Intelligence, Lecture Notes in Computer Science, pages 261–266. Springer.
- BERT: Pre-training of deep bidirectional transformers for language understanding.
- Extreme multi-label loss functions for recommendation, tagging, ranking & other missing label applications. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD ’16, pages 935–944. Association for Computing Machinery.
- Identifying the Human Values behind Arguments. In 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), pages 4459–4471. Association for Computational Linguistics.
- SemEval-2023 Task 4: ValueEval: Identification of human values behind arguments. In Proceedings of the 17th International Workshop on Semantic Evaluation, Toronto, Canada. Association for Computational Linguistics.
- RoBERTa: A robustly optimized BERT pretraining approach.
- Large-scale multi-label text classification — revisiting neural networks. In Machine Learning and Knowledge Discovery in Databases, Lecture Notes in Computer Science, pages 437–452. Springer.
- Multi-label text classification using attention-based graph neural network. In Proceedings of the 12th International Conference on Agents and Artificial Intelligence, pages 494–505.
- Study on multi-label text classification based on SVM. In 2009 Sixth International Conference on Fuzzy Systems and Knowledge Discovery, volume 1, pages 300–304.
- Evaluating feature selection methods for multi-label text classification.
- Contrastive learning-enhanced nearest neighbor mechanism for multi-label text classification. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 672–679. Association for Computational Linguistics.
- Attention is all you need.
- Label-specific document representation for multi-label text classification. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 466–475. Association for Computational Linguistics.
- SGM: Sequence generation model for multi-label classification.