Smoothing Dialogue States for Open Conversational Machine Reading (2108.12599v2)

Published 28 Aug 2021 in cs.CL, cs.AI, cs.HC, and cs.IR

Abstract: Conversational machine reading (CMR) requires machines to communicate with humans through multi-turn interactions that alternate between two salient dialogue states: decision making and question generation. In the open CMR setting, the more realistic scenario, the retrieved background knowledge is noisy, which poses severe challenges for information transmission. Existing studies commonly train independent or pipeline systems for the two subtasks, but these methods rely on hard-label decisions to activate question generation, which ultimately hinders model performance. In this work, we propose an effective gating strategy that smooths the two dialogue states within a single decoder, bridging decision making and question generation to provide a richer dialogue state reference. Experiments on the OR-ShARC dataset show the effectiveness of our method, which achieves new state-of-the-art results.
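The core contrast the abstract draws — a hard-label decision that activates question generation versus a smoothed gate that mixes both dialogue states for the decoder — can be sketched as follows. This is a minimal illustration with hypothetical state vectors and function names, not the paper's actual architecture:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over decision logits.
    e = np.exp(x - x.max())
    return e / e.sum()

def hard_gate(decision_logits, h_decision, h_question):
    # Pipeline-style baseline: an argmax hard label picks exactly
    # one dialogue state to hand to the question generator.
    return h_question if decision_logits.argmax() == 1 else h_decision

def smooth_gate(decision_logits, h_decision, h_question):
    # Smoothed gating: both dialogue states are mixed by the
    # decision probabilities, so the single decoder sees a richer
    # dialogue state reference instead of a one-hot choice.
    p = softmax(decision_logits)
    return p[0] * h_decision + p[1] * h_question

# Toy example with 4-dimensional hidden states (values illustrative).
logits = np.array([0.2, 1.1])   # hypothetical decision logits
h_dec = np.ones(4)              # stand-in decision-making state
h_qg = np.zeros(4)              # stand-in question-generation state
mixed = smooth_gate(logits, h_dec, h_qg)
```

Unlike the hard gate, the smoothed output keeps a gradient path through both states, which is what lets the two subtasks be trained jointly in one decoder.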

Authors (5)
  1. Zhuosheng Zhang (125 papers)
  2. Siru Ouyang (22 papers)
  3. Hai Zhao (227 papers)
  4. Masao Utiyama (39 papers)
  5. Eiichiro Sumita (31 papers)
Citations (6)
