A Unified Framework for Multi-Domain CTR Prediction via Large Language Models (2312.10743v4)

Published 17 Dec 2023 in cs.IR

Abstract: Click-Through Rate (CTR) prediction is a crucial task in online recommendation platforms, as it involves estimating the probability that a user will engage with an advertisement or item by clicking on it. Given the availability of various services such as online shopping, ride-sharing, food delivery, and professional services on commercial platforms, recommendation systems on these platforms are required to make CTR predictions across multiple domains rather than a single one. However, multi-domain click-through rate (MDCTR) prediction remains a challenging task in online recommendation due to the complex mutual influence between domains. Traditional MDCTR models typically encode domains as discrete identifiers, ignoring the rich semantic information that underlies them; consequently, they can hardly generalize to new domains. Moreover, existing models can easily be dominated by a few specific domains, which results in significant performance drops in the other domains (i.e., the "seesaw phenomenon"). In this paper, we propose a novel solution, Uni-CTR, to address the above challenges. Uni-CTR leverages a backbone LLM to learn layer-wise semantic representations that capture commonalities between domains, and uses several domain-specific networks to capture the characteristics of each domain. We design a masked loss strategy so that these domain-specific networks are decoupled from the backbone LLM, which allows them to remain unchanged when new domains are incorporated or existing ones removed, significantly enhancing the flexibility and scalability of the system. Experimental results on three public datasets show that Uni-CTR significantly outperforms state-of-the-art (SOTA) MDCTR models. Furthermore, Uni-CTR demonstrates remarkable effectiveness in zero-shot prediction. We have applied Uni-CTR in industrial scenarios, confirming its efficiency.
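The masked-loss decoupling described in the abstract can be illustrated with a minimal PyTorch sketch. This is not the paper's implementation: the class UniCTRSketch, the function masked_bce_loss, the generic backbone module, and all layer sizes are hypothetical, and the actual Uni-CTR learns layer-wise LLM representations rather than the single pooled vector assumed here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class UniCTRSketch(nn.Module):
    """Illustrative sketch (not the authors' code): a shared backbone feeds
    several domain-specific towers; a masked loss updates only the tower of
    the domain each sample belongs to, decoupling the towers from one another."""

    def __init__(self, backbone: nn.Module, hidden_size: int, num_domains: int):
        super().__init__()
        self.backbone = backbone  # any encoder returning [batch, hidden_size]
        self.towers = nn.ModuleList(
            nn.Sequential(nn.Linear(hidden_size, 64), nn.ReLU(), nn.Linear(64, 1))
            for _ in range(num_domains)
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        shared = self.backbone(features)  # shared semantic representation
        # one CTR logit per domain: [batch, num_domains]
        return torch.cat([tower(shared) for tower in self.towers], dim=1)


def masked_bce_loss(logits: torch.Tensor, labels: torch.Tensor,
                    domain_ids: torch.Tensor) -> torch.Tensor:
    """Binary cross-entropy computed only on the column matching each sample's
    domain, so gradients never flow into the other domains' towers."""
    targets = labels.unsqueeze(1).expand_as(logits).float()
    per_domain = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    mask = F.one_hot(domain_ids, num_classes=logits.size(1)).float()
    return (per_domain * mask).sum() / mask.sum()
```

Because each sample's gradient reaches only its own domain tower, the towers for existing domains stay untouched when a new domain (and a new tower) is added, which is the flexibility and scalability property the abstract highlights.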

Authors (9)
  1. Zichuan Fu (6 papers)
  2. Xiangyang Li (58 papers)
  3. Chuhan Wu (87 papers)
  4. Yichao Wang (45 papers)
  5. Kuicai Dong (17 papers)
  6. Xiangyu Zhao (192 papers)
  7. Mengchen Zhao (8 papers)
  8. Huifeng Guo (60 papers)
  9. Ruiming Tang (171 papers)
Citations (6)
