
AdaSparse: Learning Adaptively Sparse Structures for Multi-Domain Click-Through Rate Prediction (2206.13108v2)

Published 27 Jun 2022 in cs.IR and cs.LG

Abstract: Click-through rate (CTR) prediction is a fundamental technique in recommendation and advertising systems. Recent studies have shown that learning a unified model to serve multiple domains is effective for improving overall performance. However, it remains challenging to improve generalization across domains under limited training data, and current solutions are hard to deploy due to their computational complexity. In this paper, we propose a simple yet effective framework, AdaSparse, for multi-domain CTR prediction, which learns an adaptively sparse structure for each domain, achieving better generalization across domains at lower computational cost. In AdaSparse, we introduce domain-aware neuron-level weighting factors to measure the importance of neurons, so that for each domain the model can prune redundant neurons to improve generalization. We further add flexible sparsity regularizations to control the sparsity ratio of the learned structures. Offline and online experiments show that AdaSparse significantly outperforms previous multi-domain CTR models.
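The mechanism described in the abstract — per-domain neuron-level weighting factors that prune redundant hidden neurons, plus a sparsity regularizer — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the dimensions, the hard pruning threshold, and the L1-style penalty are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): 3 domains, 8-d input, 16 hidden neurons.
num_domains, d_in, d_hidden = 3, 8, 16

# Shared hidden-layer parameters used by all domains.
W = rng.normal(scale=0.1, size=(d_in, d_hidden))
b = np.zeros(d_hidden)

# Domain-aware neuron-level weighting factors: one vector per domain that
# scales each hidden neuron; small factors are pruned to zero per domain.
domain_factors = rng.uniform(size=(num_domains, d_hidden))

def forward(x, domain_id, threshold=0.3):
    """Shared hidden layer, then an adaptively sparse, domain-specific gate."""
    h = np.maximum(x @ W + b, 0.0)          # shared ReLU layer
    pi = domain_factors[domain_id].copy()
    pi[pi < threshold] = 0.0                # prune redundant neurons for this domain
    return h * pi                           # sparse, domain-adapted activations

def sparsity_penalty(lam=1e-2):
    """L1-style regularizer encouraging sparse weighting factors overall."""
    return lam * np.abs(domain_factors).sum()

x = rng.normal(size=d_in)
out = forward(x, domain_id=1)
print("active neurons for domain 1:", int((out != 0).sum()), "of", d_hidden)
```

In the paper the weighting factors are produced by a learned, domain-aware module and the sparsity ratio is controlled by flexible regularizations; here both are replaced by free parameters and a fixed threshold purely to show the pruning arithmetic.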

Authors (6)
  1. Xuanhua Yang (5 papers)
  2. Xiaoyu Peng (4 papers)
  3. Penghui Wei (11 papers)
  4. Shaoguo Liu (19 papers)
  5. Liang Wang (512 papers)
  6. Bo Zheng (205 papers)
Citations (25)