SemEval 2024 -- Task 10: Emotion Discovery and Reasoning its Flip in Conversation (EDiReF) (2402.18944v1)
Abstract: We present SemEval-2024 Task 10, a shared task centred on identifying emotions and finding the rationale behind their flips within monolingual English and Hindi-English code-mixed dialogues. The task comprises three distinct subtasks: emotion recognition in conversation for code-mixed dialogues, emotion flip reasoning for code-mixed dialogues, and emotion flip reasoning for English dialogues. Participating systems were tasked with automatically performing one or more of these subtasks. The datasets for these tasks comprise manually annotated conversations focusing on emotions and triggers for emotion shifts (the task data is available at https://github.com/LCS2-IIITD/EDiReF-SemEval2024.git). A total of 84 participants engaged in this task, with the most adept systems attaining F1-scores of 0.70, 0.79, and 0.76 for the respective subtasks. This paper summarises the results and findings from 24 teams alongside their system descriptions.
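The abstract reports system quality as F1-scores over utterance-level labels. As a minimal sketch of how such a score is computed (the exact averaging variant used by the task is not stated here, so macro-averaging and the toy labels below are illustrative assumptions, not task data):

```python
def f1_per_class(gold, pred, label):
    """F1 for one emotion label from true/false positives and false negatives."""
    tp = sum(1 for g, p in zip(gold, pred) if g == label and p == label)
    fp = sum(1 for g, p in zip(gold, pred) if g != label and p == label)
    fn = sum(1 for g, p in zip(gold, pred) if g == label and p != label)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def macro_f1(gold, pred):
    """Unweighted mean of per-class F1 over all labels seen in gold or pred."""
    labels = sorted(set(gold) | set(pred))
    return sum(f1_per_class(gold, pred, l) for l in labels) / len(labels)

# Hypothetical utterance-level emotion annotations for one short dialogue.
gold = ["joy", "joy", "anger", "neutral", "sadness", "neutral"]
pred = ["joy", "neutral", "anger", "neutral", "neutral", "neutral"]
print(round(macro_f1(gold, pred), 3))  # → 0.583
```

A weighted average (per-class F1 weighted by class frequency) is another common choice for imbalanced emotion distributions; only the class-weight term in `macro_f1` would change.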
Authors: Shivani Kumar, Md Shad Akhtar, Erik Cambria, Tanmoy Chakraborty