
Seq2seq Translation Model for Sequential Recommendation (1912.07274v2)

Published 16 Dec 2019 in cs.IR

Abstract: Context information, such as product category, plays a critical role in sequential recommendation. Recent years have witnessed growing interest in context-aware sequential recommender systems. Existing studies often treat the contexts as auxiliary feature vectors without considering the sequential dependency in contexts. However, such a dependency provides valuable clues for predicting a user's future behavior. For example, a user might buy electronic accessories after he/she buys an electronic product. In this paper, we propose a novel seq2seq translation architecture to highlight the importance of sequential dependency in contexts for sequential recommendation. Specifically, we first construct a collateral context sequence in addition to the main interaction sequence. We then generalize recent advancements in translation models from sequences of words in two languages to sequences of items and contexts in recommender systems. Taking category information as an item's context, we develop a basic coupled and an extended tripled seq2seq translation model to encode the category-item and item-category-item relations between the item and context sequences. We conduct extensive experiments on three real-world datasets. The results demonstrate the superior performance of the proposed model compared with state-of-the-art baselines.
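The first step the abstract describes, building a collateral context sequence alongside the main interaction sequence and framing it as a seq2seq translation pair, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the item-to-category mapping, and the `<bos>` token convention are assumptions for the coupled (category-to-item) setup.

```python
def make_coupled_example(items, item_to_category, bos="<bos>"):
    """Build one coupled seq2seq training example (a sketch).

    items            -- the user's main interaction sequence (item ids)
    item_to_category -- hypothetical lookup from item to its category context

    Returns (encoder_input, decoder_input, decoder_target):
    the encoder reads the collateral category sequence, and the decoder
    is trained to "translate" it into the item sequence, with the usual
    one-step-shifted teacher-forcing inputs.
    """
    # Collateral context sequence: the category of each interacted item.
    categories = [item_to_category[i] for i in items]
    # Standard seq2seq shift: decoder sees <bos> + items[:-1],
    # and is supervised to emit the full item sequence.
    decoder_input = [bos] + items[:-1]
    decoder_target = list(items)
    return categories, decoder_input, decoder_target


# Illustrative usage with made-up items and categories.
items = ["laptop", "mouse", "headphones"]
cats = {"laptop": "electronics", "mouse": "accessories", "headphones": "accessories"}
enc, dec_in, dec_tgt = make_coupled_example(items, cats)
```

The tripled variant described in the abstract would additionally condition the decoder on the item sequence itself (item-category-item), but the same sequence-construction step applies.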

Citations (3)
