Multilingual Dialogue Generation with Shared-Private Memory (1910.02365v1)

Published 6 Oct 2019 in cs.CL and cs.AI

Abstract: Existing dialog systems are all monolingual, where features shared among different languages are rarely explored. In this paper, we introduce a novel multilingual dialogue system. Specifically, we augment the sequence-to-sequence framework with an improved shared-private memory. The shared memory learns common features among different languages and facilitates cross-lingual transfer to boost dialogue systems, while the private memory is owned by each separate language to capture its unique features. Experiments conducted on Chinese and English conversation corpora of different scales show that our proposed architecture outperforms the individually learned model with the help of the other language, where the improvement is particularly distinct when the training data is limited.
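The abstract describes a shared memory common to all languages plus a private memory per language, both read by the model to produce features. A minimal sketch of that read operation, using NumPy and hypothetical names (`SharedPrivateMemory`, slot counts, and dimensions are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

class SharedPrivateMemory:
    """Hypothetical sketch: one shared memory matrix for all languages,
    plus a private memory matrix owned by each language. A query vector
    attends over both, and the two weighted reads are concatenated."""

    def __init__(self, num_slots, dim, languages, seed=0):
        rng = np.random.default_rng(seed)
        # Shared slots: intended to capture cross-lingual features.
        self.shared = rng.normal(size=(num_slots, dim))
        # Private slots: one matrix per language for its unique features.
        self.private = {lang: rng.normal(size=(num_slots, dim))
                        for lang in languages}

    def read(self, query, lang):
        # Attention weights over shared and private slots.
        w_shared = softmax(self.shared @ query)
        w_private = softmax(self.private[lang] @ query)
        # Concatenate the two weighted reads into one feature vector.
        return np.concatenate([w_shared @ self.shared,
                               w_private @ self.private[lang]])

mem = SharedPrivateMemory(num_slots=4, dim=8, languages=["en", "zh"])
query = np.ones(8)
features = mem.read(query, "zh")
print(features.shape)  # (16,)
```

In the paper's setup, such a concatenated read would feed the decoder of a seq2seq model, with the shared memory trained jointly on both corpora and each private memory trained only on its own language.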

Authors (6)
  1. Chen Chen
  2. Lisong Qiu
  3. Zhenxin Fu
  4. Dongyan Zhao
  5. Junfei Liu
  6. Rui Yan
Citations (7)