Teacher-Student Framework Enhanced Multi-domain Dialogue Generation (1908.07137v2)

Published 20 Aug 2019 in cs.CL and cs.AI

Abstract: Dialogue systems that handle multi-domain tasks are in high demand. How to represent the dialogue state remains a key problem in task-oriented dialogue systems. Typically, human-defined features serve as dialogue states, and a state tracker is applied to extract them; however, the performance of such a system is limited by error propagation from the state tracker. In this paper, we propose a dialogue generation model that needs no external state tracker yet still benefits from human-labeled semantic data. Using a teacher-student framework, several teacher models are first trained in their individual domains, learning dialogue policies from labeled states. The learned knowledge and experience are then merged and transferred to a universal student model, which takes raw utterances as input. Experiments show that a dialogue system trained under our framework outperforms one that uses a belief tracker.
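
The sketch below illustrates the general teacher-student idea described in the abstract: per-domain teachers trained on labeled dialogue states, and a universal student that consumes raw utterance encodings and learns by matching the teachers' output distributions. It is a minimal illustration only; the module names, dimensions, temperature, and the single-step KL-distillation loss are assumptions for clarity, not the authors' implementation.

```python
# Minimal sketch of multi-domain teacher-student distillation.
# All names, dimensions, and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB_SIZE, STATE_DIM, UTTER_DIM, HIDDEN = 1000, 32, 64, 128

class TeacherPolicy(nn.Module):
    """Domain-specific teacher: maps a human-labeled dialogue state to a
    response distribution (simplified to a single decoding step)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(STATE_DIM, HIDDEN), nn.ReLU(),
                                 nn.Linear(HIDDEN, VOCAB_SIZE))
    def forward(self, state):
        return self.net(state)  # logits

class StudentPolicy(nn.Module):
    """Universal student: consumes a raw utterance encoding, no state tracker."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(UTTER_DIM, HIDDEN), nn.ReLU(),
                                 nn.Linear(HIDDEN, VOCAB_SIZE))
    def forward(self, utterance):
        return self.net(utterance)  # logits

def distill_step(student, teachers, batch, optimizer, temperature=2.0):
    """One distillation update: the student matches the softened output of the
    teacher responsible for the batch's domain via KL divergence."""
    utterance, state, domain_id = batch
    with torch.no_grad():
        teacher_logits = teachers[domain_id](state)
    student_logits = student(utterance)
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    teachers = [TeacherPolicy().eval() for _ in range(3)]  # one teacher per domain
    student = StudentPolicy()
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    # Dummy batch: raw utterance encoding, labeled state, domain index.
    batch = (torch.randn(8, UTTER_DIM), torch.randn(8, STATE_DIM), 0)
    print(distill_step(student, teachers, batch, opt))
```

In this reading, merging the teachers' "knowledge and experience" amounts to distilling from whichever domain teacher matches each training example, so the student never needs an explicit state tracker at inference time.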

Authors (6)
  1. Shuke Peng (2 papers)
  2. Xinjing Huang (4 papers)
  3. Zehao Lin (38 papers)
  4. Feng Ji (75 papers)
  5. Haiqing Chen (29 papers)
  6. Yin Zhang (98 papers)
Citations (17)
