
Learning Representations from Product Titles for Modeling Shopping Transactions (1811.01166v3)

Published 3 Nov 2018 in cs.IR

Abstract: Shopping transaction analysis is important for understanding the shopping behaviors of customers. Existing models such as association rules are poor at modeling products that have short purchase histories and cannot be applied to new products (the cold-start problem). In this paper, we propose BASTEXT, an efficient model of shopping baskets and the texts associated with the products (e.g., product titles). The model's goal is to learn the product representations from the textual contents to capture the relationships between the products in the baskets. Given the products already in a basket, a classifier identifies whether a potential product is relevant to the basket based on their vector representations. This relevancy enables us to learn high-quality representations of the products. The experiments demonstrate that BASTEXT can efficiently model millions of baskets and that it outperforms the state-of-the-art methods in the next product recommendation task. We also show that BASTEXT is a strong baseline for keyword-based product search.
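To make the core idea concrete, here is a minimal sketch of the kind of scoring the abstract describes: each product is represented by averaging the word vectors of its title, a basket context is the mean of its products' vectors, and a candidate product's relevance is a sigmoid of the dot product between the context and the candidate. This is an illustrative toy with random embeddings, not the paper's actual BASTEXT training objective or architecture; the vocabulary, dimensions, and function names are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary of title words; in BASTEXT-style models these word
# embeddings would be the learned parameters (here they are random).
vocab = ["red", "wine", "cheese", "cracker", "soda", "chips"]
dim = 8
word_vecs = {w: rng.normal(scale=0.1, size=dim) for w in vocab}

def product_vec(title):
    """A product's vector: the average of its title's word vectors."""
    return np.mean([word_vecs[w] for w in title.split()], axis=0)

def basket_relevance(basket_titles, candidate_title):
    """Relevance of a candidate product to a basket: sigmoid of the dot
    product between the basket context (mean of member product vectors)
    and the candidate product's vector."""
    context = np.mean([product_vec(t) for t in basket_titles], axis=0)
    score = float(context @ product_vec(candidate_title))
    return 1.0 / (1.0 + np.exp(-score))

s = basket_relevance(["red wine", "cheese cracker"], "chips")
```

Because products are built from title words rather than product IDs, a brand-new product with a title gets a vector immediately, which is how this style of model sidesteps the cold-start problem the abstract mentions.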

Citations (1)
