Conversational Recommender System and Large Language Model Are Made for Each Other in E-commerce Pre-sales Dialogue (2310.14626v2)

Published 23 Oct 2023 in cs.CL and cs.IR

Abstract: E-commerce pre-sales dialogue aims to understand and elicit user needs and preferences for the items they are seeking so as to provide appropriate recommendations. Conversational recommender systems (CRSs) learn user representation and provide accurate recommendations based on dialogue context, but rely on external knowledge. LLMs generate responses that mimic pre-sales dialogues after fine-tuning, but lack domain-specific knowledge for accurate recommendations. Intuitively, the strengths of LLM and CRS in E-commerce pre-sales dialogues are complementary, yet no previous work has explored this. This paper investigates the effectiveness of combining LLM and CRS in E-commerce pre-sales dialogues, proposing two collaboration methods: CRS assisting LLM and LLM assisting CRS. We conduct extensive experiments on a real-world dataset of Ecommerce pre-sales dialogues. We analyze the impact of two collaborative approaches with two CRSs and two LLMs on four tasks of Ecommerce pre-sales dialogue. We find that collaborations between CRS and LLM can be very effective in some cases.

Authors (9)
  1. Yuanxing Liu (8 papers)
  2. Wei-Nan Zhang (19 papers)
  3. Yifan Chen (164 papers)
  4. Yuchi Zhang (13 papers)
  5. Haopeng Bai (2 papers)
  6. Fan Feng (50 papers)
  7. Hengbin Cui (5 papers)
  8. Yongbin Li (128 papers)
  9. Wanxiang Che (152 papers)
Citations (10)