History-Aware Hierarchical Transformer for Multi-session Open-domain Dialogue System (2302.00907v1)

Published 2 Feb 2023 in cs.CL

Abstract: With the evolution of pre-trained LLMs, current open-domain dialogue systems have achieved great progress in conducting one-session conversations. In contrast, Multi-Session Conversation (MSC), which consists of multiple sessions over a long term with the same user, is under-investigated. In this paper, we propose History-Aware Hierarchical Transformer (HAHT) for multi-session open-domain dialogue. HAHT maintains a long-term memory of history conversations and utilizes history information to understand current conversation context and generate well-informed and context-relevant responses. Specifically, HAHT first encodes history conversation sessions hierarchically into a history memory. Then, HAHT leverages historical information to facilitate the understanding of the current conversation context by encoding the history memory together with the current context with attention-based mechanisms. Finally, to explicitly utilize historical information, HAHT uses a history-aware response generator that switches between a generic vocabulary and a history-aware vocabulary. Experimental results on a large-scale MSC dataset suggest that the proposed HAHT model consistently outperforms baseline models. Human evaluation results support that HAHT generates more human-like, context-relevant and history-relevant responses than baseline models.
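To make the described architecture concrete, below is a minimal, illustrative PyTorch sketch of the three components the abstract names: hierarchical encoding of past sessions into a history memory, attention-based fusion of that memory with the current context, and a generator that switches between a generic-vocabulary distribution and a history-aware (copy-style) distribution. This is not the authors' implementation; all module names, dimensions, and the pooling/copy details here are assumptions for illustration, and the real HAHT model builds on a pre-trained encoder-decoder Transformer.

```python
# Illustrative sketch of the HAHT idea (hypothetical names/dimensions, not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class HistoryAwareSketch(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64, n_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Token-level and session-level encoders (toy depth of one layer each).
        tok_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.token_encoder = nn.TransformerEncoder(tok_layer, num_layers=1)
        sess_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.session_encoder = nn.TransformerEncoder(sess_layer, num_layers=1)
        # Cross-attention lets the current context attend to the history memory.
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Switch between the generic vocabulary and the history-aware distribution.
        self.switch = nn.Linear(d_model, 1)
        self.vocab_proj = nn.Linear(d_model, vocab_size)

    def encode_history(self, history_sessions):
        # history_sessions: list of (1, seq_len) token tensors, one per past session.
        # Each session is encoded at token level, then mean-pooled to one vector.
        session_vecs = []
        for sess in history_sessions:
            h = self.token_encoder(self.embed(sess))        # (1, seq_len, d_model)
            session_vecs.append(h.mean(dim=1))              # (1, d_model)
        memory = torch.stack(session_vecs, dim=1)           # (1, n_sessions, d_model)
        return self.session_encoder(memory)                 # history memory

    def forward(self, history_sessions, context, history_tokens):
        # context: (1, ctx_len) current-session tokens
        # history_tokens: (1, n_hist_tok) flattened history tokens usable for copying
        memory = self.encode_history(history_sessions)
        ctx = self.token_encoder(self.embed(context))        # (1, ctx_len, d_model)
        # Fuse the history memory into the current context representation.
        fused, _ = self.cross_attn(ctx, memory, memory)
        state = fused[:, -1]                                  # last position as a toy decoder state
        gen_dist = F.softmax(self.vocab_proj(state), dim=-1)  # generic-vocabulary distribution
        # History-aware distribution: attention over history token embeddings,
        # scattered back onto the vocabulary (a pointer-style copy over history words).
        hist_emb = self.embed(history_tokens)                 # (1, n_hist_tok, d_model)
        scores = torch.einsum("bd,btd->bt", state, hist_emb)
        copy_attn = F.softmax(scores, dim=-1)
        copy_dist = torch.zeros_like(gen_dist).scatter_add(1, history_tokens, copy_attn)
        p_copy = torch.sigmoid(self.switch(state))            # switch probability
        return (1 - p_copy) * gen_dist + p_copy * copy_dist   # next-token distribution
```

The switch probability plays the role of the history-aware response generator's vocabulary selector: when it is high, probability mass is moved onto words that actually appeared in earlier sessions, which is one simple way to make responses explicitly history-relevant.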

Authors (8)
  1. Tong Zhang (569 papers)
  2. Yong Liu (721 papers)
  3. Boyang Li (106 papers)
  4. Zhiwei Zeng (17 papers)
  5. Pengwei Wang (29 papers)
  6. Yuan You (27 papers)
  7. Chunyan Miao (145 papers)
  8. Lizhen Cui (66 papers)
Citations (11)