Transfer Learning for Context-Aware Question Matching in Information-seeking Conversations in E-commerce (1806.05434v1)

Published 14 Jun 2018 in cs.CL

Abstract: Building multi-turn information-seeking conversation systems is an important and challenging research topic. Although several advanced neural text matching models have been proposed for this task, they are generally not efficient for industrial applications. Furthermore, they rely on a large amount of labeled data, which may not be available in real-world applications. To alleviate these problems, we study transfer learning for multi-turn information seeking conversations in this paper. We first propose an efficient and effective multi-turn conversation model based on convolutional neural networks. After that, we extend our model to adapt the knowledge learned from a resource-rich domain to enhance the performance. Finally, we deployed our model in an industrial chatbot called AliMe Assist (https://consumerservice.taobao.com/online-help) and observed a significant improvement over the existing online model.
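The abstract describes a CNN-based matching model whose knowledge from a resource-rich source domain is transferred to the target domain. As an illustration only (the class name, layer shapes, and the shared-encoder/per-domain-head split below are assumptions, not the paper's actual architecture), such a design can be sketched as a shared convolutional encoder with domain-specific output heads:

```python
import torch
import torch.nn as nn

class ConvMatcher(nn.Module):
    """Hypothetical sketch of a CNN-based context-aware question matcher.

    A shared 1-D convolutional encoder captures domain-general matching
    patterns; per-domain classification heads let parameters learned on a
    resource-rich source domain transfer to the target domain. All names
    and sizes are illustrative.
    """

    def __init__(self, vocab_size=1000, emb_dim=32, n_filters=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # shared encoder: convolution over token embeddings + max pooling
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)
        # domain-specific heads: source (resource-rich) and target domains
        self.heads = nn.ModuleDict({
            "source": nn.Linear(2 * n_filters, 2),
            "target": nn.Linear(2 * n_filters, 2),
        })

    def encode(self, ids):
        x = self.embed(ids).transpose(1, 2)                 # (B, emb_dim, T)
        return torch.relu(self.conv(x)).max(dim=2).values   # (B, n_filters)

    def forward(self, context_ids, candidate_ids, domain="target"):
        # concatenate pooled context and candidate-question representations
        pair = torch.cat(
            [self.encode(context_ids), self.encode(candidate_ids)], dim=1
        )
        return self.heads[domain](pair)  # match / no-match logits

model = ConvMatcher()
ctx = torch.randint(0, 1000, (4, 12))    # batch of 4 contexts, 12 tokens each
cand = torch.randint(0, 1000, (4, 12))   # 4 candidate questions
logits = model(ctx, cand, domain="target")
```

In this sketch, training would alternate between source- and target-domain batches so the shared encoder absorbs transferable matching signal while each head fits its own domain.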

Authors (9)
  1. Minghui Qiu (58 papers)
  2. Liu Yang (194 papers)
  3. Feng Ji (74 papers)
  4. Weipeng Zhao (3 papers)
  5. Wei Zhou (308 papers)
  6. Jun Huang (126 papers)
  7. Haiqing Chen (29 papers)
  8. W. Bruce Croft (46 papers)
  9. Wei Lin (207 papers)
Citations (24)