Preview, Attend and Review: Schema-Aware Curriculum Learning for Multi-Domain Dialog State Tracking (2106.00291v1)
Abstract: Existing dialog state tracking (DST) models are trained on dialog data in a random order, neglecting the rich structural information in the dataset. In this paper, we propose to use curriculum learning (CL) to better leverage both the curriculum structure and the schema structure of task-oriented dialogs. Specifically, we propose a model-agnostic framework called Schema-aware Curriculum Learning for Dialog State Tracking (SaCLog), which consists of a preview module that pre-trains a DST model with schema information, a curriculum module that optimizes the model with CL, and a review module that augments mispredicted data to reinforce the CL training. We show that our proposed approach improves DST performance for both a transformer-based and an RNN-based DST model (TripPy and TRADE, respectively) and achieves new state-of-the-art results on WOZ2.0 and MultiWOZ2.1.
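The curriculum module described above orders training data from easy to hard rather than sampling randomly. A minimal sketch of such an easy-to-hard ("baby steps") schedule is below; the difficulty proxy (turn and slot counts) and all function names are illustrative assumptions, not the paper's actual scoring, which combines model- and schema-based difficulty estimates.

```python
# Hypothetical sketch of a curriculum-ordered training schedule.
# Difficulty scoring here is a toy proxy, not the SaCLog scorer.
from typing import Dict, List


def difficulty(dialog: Dict) -> float:
    # Toy assumption: dialogs with more turns and more slots are harder.
    return dialog["num_turns"] + 2 * dialog["num_slots"]


def curriculum_order(dialogs: List[Dict]) -> List[Dict]:
    # Sort training dialogs from easiest to hardest.
    return sorted(dialogs, key=difficulty)


def staged_subsets(dialogs: List[Dict], num_stages: int) -> List[List[Dict]]:
    # Stage k trains on the easiest (k+1)/num_stages fraction of the data,
    # a simple "baby steps" curriculum schedule.
    ordered = curriculum_order(dialogs)
    n = len(ordered)
    return [ordered[: max(1, (k + 1) * n // num_stages)]
            for k in range(num_stages)]


if __name__ == "__main__":
    data = [
        {"id": "a", "num_turns": 8, "num_slots": 5},
        {"id": "b", "num_turns": 2, "num_slots": 1},
        {"id": "c", "num_turns": 5, "num_slots": 3},
    ]
    for stage in staged_subsets(data, num_stages=3):
        print([d["id"] for d in stage])
```

In this sketch, each stage's subset grows until the final stage covers the full dataset, so later training still revisits easy examples while gradually introducing harder ones.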
- Yinpei Dai
- Hangyu Li
- Yongbin Li
- Jian Sun
- Fei Huang
- Luo Si
- Xiaodan Zhu