Uni-Retriever: Towards Learning The Unified Embedding Based Retriever in Bing Sponsored Search (2202.06212v1)

Published 13 Feb 2022 in cs.IR and cs.CL

Abstract: Embedding based retrieval (EBR) is a fundamental building block in many web applications. However, EBR in sponsored search is distinguished from other generic scenarios and technically challenging due to the need to serve multiple retrieval purposes: firstly, it has to retrieve high-relevance ads, which exactly serve the user's search intent; secondly, it needs to retrieve high-CTR ads so as to maximize the overall user clicks. In this paper, we present a novel representation learning framework, Uni-Retriever, developed for Bing Search, which unifies two different training modes, knowledge distillation and contrastive learning, to realize both required objectives. On one hand, the capability of making high-relevance retrieval is established by distilling knowledge from the "relevance teacher model". On the other hand, the capability of making high-CTR retrieval is optimized by learning to discriminate the user's clicked ads from the entire corpus. The two training modes are jointly performed as a multi-objective learning process, such that ads of high relevance and CTR can be favored by the generated embeddings. Besides the learning strategy, we also elaborate our solution for the EBR serving pipeline, built upon a substantially optimized DiskANN, where massive-scale EBR can be performed with competitive time and memory efficiency and accomplished with high quality. We conduct comprehensive offline and online experiments to evaluate the proposed techniques, whose findings may provide useful insights for the future development of EBR systems. Uni-Retriever has been mainstreamed as the major retrieval path in Bing's production thanks to the notable improvements in representation and EBR serving quality.
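
The multi-objective training described in the abstract (contrastive learning over clicked ads plus distillation from a relevance teacher) can be illustrated with a short PyTorch sketch. This is a minimal sketch under stated assumptions: a bi-encoder producing query and ad embeddings, in-batch negatives standing in for the corpus, and KL-divergence distillation against teacher relevance scores. The function name, temperatures, and loss weighting below are hypothetical and are not the paper's exact formulation.

```python
# Hedged sketch of a joint contrastive + distillation objective for a
# bi-encoder retriever, in the spirit of Uni-Retriever's multi-objective
# training. All names and hyperparameters here are illustrative assumptions.
import torch
import torch.nn.functional as F

def uni_retriever_loss(q_emb, ad_emb, teacher_scores,
                       tau_ctr=0.05, tau_kd=1.0, alpha=1.0):
    """
    q_emb:          [B, d] query embeddings from the student encoder
    ad_emb:         [B, d] embeddings of the clicked ads (positives);
                    the other ads in the batch act as negatives
    teacher_scores: [B, B] relevance scores from the "relevance teacher"
                    for every (query, ad) pair in the batch
    """
    q_emb = F.normalize(q_emb, dim=-1)
    ad_emb = F.normalize(ad_emb, dim=-1)
    sim = q_emb @ ad_emb.T                      # [B, B] similarity matrix

    # Contrastive term: discriminate each query's clicked ad from the
    # other ads in the batch (in-batch negatives approximate the corpus).
    labels = torch.arange(sim.size(0), device=sim.device)
    loss_ctr = F.cross_entropy(sim / tau_ctr, labels)

    # Distillation term: match the student's score distribution to the
    # teacher's relevance distribution over the same candidate ads.
    loss_kd = F.kl_div(
        F.log_softmax(sim / tau_kd, dim=-1),
        F.softmax(teacher_scores / tau_kd, dim=-1),
        reduction="batchmean",
    )
    return loss_ctr + alpha * loss_kd
```

In this reading, the contrastive term drives high-CTR retrieval while the distillation term transfers the teacher's relevance judgments, and the weighted sum realizes the joint multi-objective learning the paper describes.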

Authors (12)
  1. Jianjin Zhang (9 papers)
  2. Zheng Liu (312 papers)
  3. Weihao Han (8 papers)
  4. Shitao Xiao (38 papers)
  5. Ruicheng Zheng (1 paper)
  6. Yingxia Shao (54 papers)
  7. Hao Sun (383 papers)
  8. Hanqing Zhu (22 papers)
  9. Premkumar Srinivasan (1 paper)
  10. Denvy Deng (9 papers)
  11. Qi Zhang (785 papers)
  12. Xing Xie (220 papers)
Citations (24)
