Efficient Dialogue State Tracking by Masked Hierarchical Transformer (2106.14433v1)
Published 28 Jun 2021 in cs.CL
Abstract: This paper describes our approach to DSTC 9 Track 2: Cross-lingual Multi-domain Dialog State Tracking, whose goal is to build a cross-lingual dialog state tracker trained on a rich-resource language and tested on a low-resource language. We formulate a method for jointly learning a slot operation classification task and a state tracking task. Furthermore, we design a novel mask mechanism for fusing contextual dialogue information. Results show that the proposed model achieves excellent performance on DSTC Challenge II, with joint accuracies of 62.37% on the MultiWOZ (en→zh) dataset and 23.96% on the CrossWOZ (zh→en) dataset.
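The abstract does not specify how the mask mechanism fuses dialogue context, so the following is only an illustrative sketch of one common hierarchical-masking pattern, not the authors' actual design: each token attends freely within its own turn, and reaches earlier turns only through a per-turn summary position (the `hierarchical_mask` helper and the summary-at-position-0 convention are assumptions for the example).

```python
# Hypothetical sketch (NOT the paper's implementation): build a 0/1
# attention mask over a dialogue where each token attends to its own
# turn in full, plus the summary token (first position) of every
# earlier turn.

def hierarchical_mask(turn_lengths):
    """Return an n x n mask for a dialogue whose turns have the given
    token lengths. mask[i][j] == 1 means position i may attend to j."""
    starts, pos = [], 0
    for length in turn_lengths:       # absolute start index of each turn
        starts.append(pos)
        pos += length
    n = pos
    mask = [[0] * n for _ in range(n)]
    for t, length in enumerate(turn_lengths):
        s = starts[t]
        for i in range(s, s + length):
            for j in range(s, s + length):
                mask[i][j] = 1        # full attention within the turn
            for prev in range(t):
                mask[i][starts[prev]] = 1  # summary of each earlier turn
    return mask
```

Such a block-plus-summary mask keeps per-turn attention dense while letting later turns see compressed context, which is one plausible reading of "fusing contextual information about dialogue."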
- Min Mao
- Jiasheng Liu
- Jingyao Zhou
- Haipang Wu