Dialogue State Distillation Network with Inter-slot Contrastive Learning for Dialogue State Tracking (2302.08220v2)

Published 16 Feb 2023 in cs.CL

Abstract: In task-oriented dialogue systems, Dialogue State Tracking (DST) aims to extract users' intentions from the dialogue history. Most existing approaches suffer from error propagation and are unable to dynamically select relevant information when utilizing previous dialogue states. Moreover, the relations between the updates of different slots provide vital clues for DST. However, existing approaches rely only on predefined graphs to capture these relations indirectly. In this paper, we propose a Dialogue State Distillation Network (DSDN) to utilize relevant information from previous dialogue states and to mitigate the gap in their utilization between training and testing. Thus, it can dynamically exploit previous dialogue states while avoiding error propagation. Further, we propose an inter-slot contrastive learning loss to effectively capture the slot co-update relations from the dialogue context. Experiments are conducted on the widely used MultiWOZ 2.0 and MultiWOZ 2.1 datasets. The experimental results show that our proposed model achieves state-of-the-art performance for DST.
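
The abstract names the inter-slot contrastive objective but gives no formula. A common way to realize such an objective is an InfoNCE-style loss that treats slots updated in the same dialogue turn as positive pairs and all other slots as negatives, which pulls co-updated slot representations together. The sketch below is a minimal PyTorch version under that assumption; the function name, the `co_update_mask` input, and the temperature value are illustrative, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def inter_slot_contrastive_loss(slot_reprs: torch.Tensor,
                                co_update_mask: torch.Tensor,
                                temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style contrastive loss over slot representations.

    slot_reprs:     (num_slots, hidden) turn-level slot vectors.
    co_update_mask: (num_slots, num_slots) boolean; True where two distinct
                    slots were updated in the same turn (positive pairs).
    """
    z = F.normalize(slot_reprs, dim=-1)          # compare in cosine space
    sim = z @ z.t() / temperature                # (S, S) pairwise logits
    # Exclude self-similarity from numerator and denominator alike.
    eye = torch.eye(sim.size(0), dtype=torch.bool, device=sim.device)
    sim = sim.masked_fill(eye, float('-inf'))
    log_prob = sim - torch.logsumexp(sim, dim=-1, keepdim=True)
    pos = co_update_mask & ~eye
    # Average log-probability of positives per anchor, skipping anchors
    # with no co-updated partner in this turn.
    pos_counts = pos.sum(dim=-1)
    has_pos = pos_counts > 0
    if not has_pos.any():
        return slot_reprs.new_zeros(())
    per_anchor = (log_prob.masked_fill(~pos, 0.0).sum(dim=-1)[has_pos]
                  / pos_counts[has_pos])
    return -per_anchor.mean()

# Toy usage: 4 slots, where slots 0 and 2 were co-updated in this turn.
reprs = torch.randn(4, 256)
mask = torch.zeros(4, 4, dtype=torch.bool)
mask[0, 2] = mask[2, 0] = True
loss = inter_slot_contrastive_loss(reprs, mask)
```

In the paper's full setup this auxiliary loss would presumably be added to the main DST training objective alongside the distillation component; the exact weighting and how positives are mined are not specified in the abstract.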

Authors (8)
  1. Jing Xu (244 papers)
  2. Dandan Song (12 papers)
  3. Chong Liu (104 papers)
  4. Siu Cheung Hui (30 papers)
  5. Fei Li (232 papers)
  6. Qiang Ju (5 papers)
  7. Xiaonan He (9 papers)
  8. Jian Xie (39 papers)
Citations (4)