
Preview, Attend and Review: Schema-Aware Curriculum Learning for Multi-Domain Dialog State Tracking (2106.00291v1)

Published 1 Jun 2021 in cs.CL

Abstract: Existing dialog state tracking (DST) models are trained with dialog data in a random order, neglecting rich structural information in a dataset. In this paper, we propose to use curriculum learning (CL) to better leverage both the curriculum structure and schema structure for task-oriented dialogs. Specifically, we propose a model-agnostic framework called Schema-aware Curriculum Learning for Dialog State Tracking (SaCLog), which consists of a preview module that pre-trains a DST model with schema information, a curriculum module that optimizes the model with CL, and a review module that augments mispredicted data to reinforce the CL training. We show that our proposed approach improves DST performance over both a transformer-based and RNN-based DST model (TripPy and TRADE) and achieves new state-of-the-art results on WOZ2.0 and MultiWOZ2.1.
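The abstract's core idea of easy-to-hard training with a review step can be sketched in a few lines. The snippet below is an illustrative sketch only, not the authors' implementation: the function names (`curriculum_stages`, `review_augment`), the stage schedule, and the difficulty scoring are all assumptions chosen to mirror the curriculum and review modules described above.

```python
import random


def curriculum_stages(examples, difficulty, num_stages=3, seed=0):
    """Order training data from easy to hard (curriculum learning).

    `examples` is a list of dialog samples and `difficulty` maps each
    sample to a score (e.g. number of active slots or turn depth; the
    paper's actual difficulty measure may differ).  At stage k the model
    trains on the easiest k/num_stages fraction of the data, shuffled
    within that pool.
    """
    rng = random.Random(seed)
    ranked = sorted(examples, key=difficulty)
    stages = []
    for k in range(1, num_stages + 1):
        pool = ranked[: max(1, len(ranked) * k // num_stages)]
        rng.shuffle(pool)  # shuffle only within the allowed pool
        stages.append(pool)
    return stages


def review_augment(stage, mispredicted):
    """Review module sketch: repeat examples the model got wrong so the
    next pass over this stage sees them twice."""
    return stage + [ex for ex in stage if ex in mispredicted]
```

A usage sketch: score each dialog turn, build the stages once, then after each stage run `review_augment` with the turns the current model mispredicts before continuing training.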

Authors (7)
  1. Yinpei Dai (17 papers)
  2. Hangyu Li (23 papers)
  3. Yongbin Li (128 papers)
  4. Jian Sun (415 papers)
  5. Fei Huang (409 papers)
  6. Luo Si (73 papers)
  7. Xiaodan Zhu (94 papers)
Citations (51)
