Multi-Task Deep Recommender Systems: A Survey (2302.03525v2)

Published 7 Feb 2023 in cs.IR

Abstract: Multi-task learning (MTL) aims to learn related tasks in a unified model so that tasks improve one another by exploiting their shared knowledge. It is an important topic in recommendation because practical systems must predict multiple objectives jointly while balancing performance and efficiency. Although MTL has been well studied and developed, a systematic review is still lacking in the recommendation community. To fill this gap, this survey provides a comprehensive review of existing multi-task deep recommender systems (MTDRS). Specifically, the problem definition of MTDRS is first given and compared with related areas. Next, the development of MTDRS is traced and a taxonomy is introduced along two axes: task relation and methodology. Task relations are categorized as parallel, cascaded, or auxiliary-with-main, while methodologies are grouped into parameter sharing, optimization, and training mechanisms. The survey concludes by summarizing the applications and public datasets of MTDRS and highlighting the challenges and future directions of the field.
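
For readers unfamiliar with the parameter-sharing category named in the abstract, the following is a minimal sketch (not taken from the survey) of a shared-bottom multi-task model, the simplest parameter-sharing architecture: two parallel tasks, here illustrated as click and conversion prediction, share a bottom network and keep task-specific towers. All layer sizes, task names, and the loss weighting are illustrative assumptions.

```python
# Minimal shared-bottom MTL sketch in PyTorch (illustrative, not the survey's code).
import torch
import torch.nn as nn


class SharedBottomMTL(nn.Module):
    def __init__(self, num_features: int, hidden_dim: int = 64, tower_dim: int = 32):
        super().__init__()
        # Shared bottom: parameters reused by every task.
        self.shared = nn.Sequential(
            nn.Linear(num_features, hidden_dim),
            nn.ReLU(),
        )
        # Task-specific towers: one output head per task.
        self.click_tower = nn.Sequential(
            nn.Linear(hidden_dim, tower_dim), nn.ReLU(), nn.Linear(tower_dim, 1)
        )
        self.conversion_tower = nn.Sequential(
            nn.Linear(hidden_dim, tower_dim), nn.ReLU(), nn.Linear(tower_dim, 1)
        )

    def forward(self, x: torch.Tensor):
        h = self.shared(x)
        # Each task produces its own logit from the shared representation.
        return self.click_tower(h).squeeze(-1), self.conversion_tower(h).squeeze(-1)


if __name__ == "__main__":
    model = SharedBottomMTL(num_features=16)
    x = torch.randn(8, 16)                       # batch of 8 user-item feature vectors
    y_click = torch.randint(0, 2, (8,)).float()  # toy binary labels
    y_conv = torch.randint(0, 2, (8,)).float()

    click_logit, conv_logit = model(x)
    bce = nn.BCEWithLogitsLoss()
    # A common MTL objective: a weighted sum of per-task losses.
    loss = bce(click_logit, y_click) + 0.5 * bce(conv_logit, y_conv)
    loss.backward()
    print(float(loss))
```

More advanced parameter-sharing designs covered by such surveys (e.g., gated expert-sharing models) replace the single shared bottom with multiple shared sub-networks, but the split between shared parameters and task-specific towers remains the same basic idea.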

Authors (9)
  1. Yuhao Wang (144 papers)
  2. Ha Tsz Lam (1 paper)
  3. Yi Wong (2 papers)
  4. Ziru Liu (11 papers)
  5. Xiangyu Zhao (192 papers)
  6. Yichao Wang (45 papers)
  7. Bo Chen (309 papers)
  8. Huifeng Guo (60 papers)
  9. Ruiming Tang (171 papers)
Citations (28)