
ItemSage: Learning Product Embeddings for Shopping Recommendations at Pinterest (2205.11728v1)

Published 24 May 2022 in cs.IR and cs.LG

Abstract: Learned embeddings for products are an important building block for web-scale e-commerce recommendation systems. At Pinterest, we build a single set of product embeddings called ItemSage to provide relevant recommendations in all shopping use cases including user, image and search based recommendations. This approach has led to significant improvements in engagement and conversion metrics, while reducing both infrastructure and maintenance cost. While most prior work focuses on building product embeddings from features coming from a single modality, we introduce a transformer-based architecture capable of aggregating information from both text and image modalities and show that it significantly outperforms single modality baselines. We also utilize multi-task learning to optimize ItemSage for several engagement types, leading to a candidate generation system that is efficient for all of the engagement objectives of the end-to-end recommendation system. Extensive offline experiments are conducted to illustrate the effectiveness of our approach and results from online A/B experiments show substantial gains in key business metrics (up to +7% gross merchandise value/user and +11% click volume).
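
To make the abstract's architectural idea concrete, below is a minimal sketch of how a transformer can aggregate text and image features into a single product embedding. This is an illustrative approximation, not the authors' exact architecture: the class name, dimensions, and the use of a learnable [CLS]-style aggregation token are assumptions for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiModalProductEncoder(nn.Module):
    """Illustrative sketch of an ItemSage-style encoder: project text and
    image features into a shared space, run a transformer over the combined
    token sequence, and read the product embedding off a [CLS]-style token.
    Dimensions and layer counts are placeholder assumptions."""

    def __init__(self, text_dim=256, image_dim=512, hidden_dim=256,
                 num_layers=2, num_heads=4, out_dim=256):
        super().__init__()
        # Per-modality projections into a shared hidden space.
        self.text_proj = nn.Linear(text_dim, hidden_dim)
        self.image_proj = nn.Linear(image_dim, hidden_dim)
        # Learnable aggregation token; its output becomes the embedding.
        self.cls_token = nn.Parameter(torch.randn(1, 1, hidden_dim))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.out = nn.Linear(hidden_dim, out_dim)

    def forward(self, text_feats, image_feats):
        # text_feats: (B, T_text, text_dim); image_feats: (B, T_img, image_dim)
        batch_size = text_feats.size(0)
        tokens = torch.cat([
            self.cls_token.expand(batch_size, -1, -1),
            self.text_proj(text_feats),
            self.image_proj(image_feats),
        ], dim=1)
        encoded = self.encoder(tokens)
        # L2-normalize so dot products act like cosine similarity at retrieval time.
        return F.normalize(self.out(encoded[:, 0]), dim=-1)
```

The multi-task aspect described in the abstract would, in a setup like this, amount to training the same embedding against several engagement objectives (e.g. clicks and purchases) by summing one retrieval-style loss per objective, so a single candidate generation index can serve all of them.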

Authors (6)
  1. Paul Baltescu (4 papers)
  2. Haoyu Chen (71 papers)
  3. Nikil Pancha (6 papers)
  4. Andrew Zhai (13 papers)
  5. Jure Leskovec (233 papers)
  6. Charles Rosenberg (12 papers)
Citations (28)