
Topic-Driven and Knowledge-Aware Transformer for Dialogue Emotion Detection (2106.01071v1)

Published 2 Jun 2021 in cs.CL

Abstract: Emotion detection in dialogues is challenging as it often requires the identification of thematic topics underlying a conversation, the relevant commonsense knowledge, and the intricate transition patterns between the affective states. In this paper, we propose a Topic-Driven Knowledge-Aware Transformer to handle the challenges above. We first design a topic-augmented language model (LM) with an additional layer specialized for topic detection. The topic-augmented LM is then combined with commonsense statements retrieved from a knowledge base according to the dialogue context. Finally, a transformer-based encoder-decoder architecture fuses the topical and commonsense information and performs the emotion label sequence prediction. The model has been evaluated on four dialogue emotion detection datasets, empirically demonstrating its superiority over existing state-of-the-art approaches. Quantitative and qualitative results show that the model can discover topics which help in distinguishing emotion categories.
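The abstract describes a three-part pipeline: a topic-augmented language model, commonsense statements retrieved from a knowledge base, and a transformer encoder-decoder that fuses both signals to predict one emotion label per utterance. The PyTorch sketch below is a rough illustration of that pipeline only; the class names, layer sizes, pooling choices, and the way knowledge vectors are injected are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class TopicAugmentedEncoder(nn.Module):
    """Sketch of a topic-augmented utterance encoder: an LM-style encoder
    with an extra layer that produces a latent topic distribution.
    (Stand-in for the paper's pretrained topic-augmented LM.)"""

    def __init__(self, hidden_dim=768, n_topics=50):
        super().__init__()
        # placeholder for a pretrained LM encoder
        self.lm = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=hidden_dim, nhead=8, batch_first=True),
            num_layers=2,
        )
        # additional layer specialized for topic detection
        self.topic_head = nn.Linear(hidden_dim, n_topics)
        self.topic_embed = nn.Linear(n_topics, hidden_dim)

    def forward(self, utterance_embs):
        # utterance_embs: (batch, seq_len, hidden_dim) token embeddings
        h = self.lm(utterance_embs)
        pooled = h.mean(dim=1)                        # utterance representation
        topic_dist = torch.softmax(self.topic_head(pooled), dim=-1)
        topic_vec = self.topic_embed(topic_dist)      # topic-aware feature
        return pooled + topic_vec, topic_dist


class TDKATransformer(nn.Module):
    """Hypothetical fusion of topical, contextual, and commonsense features,
    followed by emotion label sequence prediction over the dialogue."""

    def __init__(self, hidden_dim=768, n_emotions=7, n_topics=50):
        super().__init__()
        self.encoder = TopicAugmentedEncoder(hidden_dim, n_topics)
        self.knowledge_proj = nn.Linear(hidden_dim, hidden_dim)
        self.fusion = nn.Transformer(d_model=hidden_dim, nhead=8,
                                     num_encoder_layers=2, num_decoder_layers=2,
                                     batch_first=True)
        self.classifier = nn.Linear(hidden_dim, n_emotions)

    def forward(self, dialogue_embs, knowledge_embs):
        # dialogue_embs:  (batch, n_utts, seq_len, hidden_dim) token embeddings
        # knowledge_embs: (batch, n_utts, hidden_dim) pooled commonsense vectors
        b, n, s, d = dialogue_embs.shape
        utt_reprs, _ = self.encoder(dialogue_embs.reshape(b * n, s, d))
        utt_reprs = utt_reprs.reshape(b, n, d)
        fused_src = utt_reprs + self.knowledge_proj(knowledge_embs)
        # the encoder sees knowledge-enriched utterances; the decoder attends
        # to them and emits one emotion label per utterance in the dialogue
        out = self.fusion(src=fused_src, tgt=utt_reprs)
        return self.classifier(out)                   # (batch, n_utts, n_emotions)


if __name__ == "__main__":
    model = TDKATransformer()
    dialogue = torch.randn(2, 5, 12, 768)   # 2 dialogues, 5 utterances, 12 tokens each
    knowledge = torch.randn(2, 5, 768)      # one pooled knowledge vector per utterance
    print(model(dialogue, knowledge).shape) # torch.Size([2, 5, 7])
```

In the paper, the commonsense statements are selected from a knowledge base using the dialogue context; here they are simply assumed to arrive as pre-pooled vectors so the fusion step can be shown in isolation.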

Authors (5)
  1. Lixing Zhu (63 papers)
  2. Gabriele Pergola (26 papers)
  3. Lin Gui (66 papers)
  4. Deyu Zhou (42 papers)
  5. Yulan He (113 papers)
Citations (134)
