Modeling Long-Term and Short-Term Interests with Parallel Attentions for Session-based Recommendation (2006.15346v2)

Published 27 Jun 2020 in cs.IR and cs.LG

Abstract: The aim of session-based recommendation is to predict a user's next clicked item, a challenging task due to the inherent uncertainty in user behaviors and the anonymity of implicit feedback. A powerful session-based recommender should capture a user's evolving interests, i.e., a combination of his/her long-term and short-term interests. Recent advances in attention mechanisms have led to state-of-the-art methods for this task, but these methods have two main drawbacks. First, most attention-based methods simply use the last clicked item to represent the user's short-term interest, ignoring temporal information and behavior context, and may therefore fail to capture the user's recent preferences comprehensively. Second, current studies typically treat long-term and short-term interests as equally important, whereas their relative importance should be user-specific. We therefore propose a novel Parallel Attention Network (PAN) model for session-based recommendation. Specifically, we propose a novel time-aware attention mechanism that learns the user's short-term interest by taking contextual information and temporal signals into account simultaneously. In addition, we introduce a gated fusion method that adaptively integrates the user's long-term and short-term preferences to generate a hybrid interest representation. Experiments on three real-world datasets show that PAN achieves clear improvements over state-of-the-art methods.
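
The abstract names two mechanisms: a time-aware attention that pools recent clicks using both item context and temporal signals, and a gated fusion that learns user-specific weights for long- vs. short-term interests. Below is a minimal PyTorch sketch of those two ideas; the module name, the linear projection of time gaps, and the long-term interest input are illustrative assumptions, not the paper's exact PAN formulation.

```python
# Hedged sketch of time-aware attention + gated fusion, assuming a linear
# time-gap encoding and a precomputed long-term interest vector. The actual
# PAN architecture may differ in its scoring and fusion details.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ParallelInterestSketch(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.time_proj = nn.Linear(1, dim)    # assumed encoding of time gaps
        self.attn_score = nn.Linear(dim, 1)   # attention scorer over clicks
        self.gate = nn.Linear(2 * dim, dim)   # user-specific fusion gate

    def forward(self, item_emb: torch.Tensor, time_gaps: torch.Tensor,
                long_term: torch.Tensor) -> torch.Tensor:
        # item_emb:  (batch, seq, dim) embeddings of clicked items in the session
        # time_gaps: (batch, seq, 1) time since each click (temporal signal)
        # long_term: (batch, dim) long-term interest vector
        # Time-aware attention: score each click from its embedding plus a
        # projection of its temporal signal, then pool to a short-term vector.
        scores = self.attn_score(torch.tanh(item_emb + self.time_proj(time_gaps)))
        weights = F.softmax(scores, dim=1)            # (batch, seq, 1)
        short_term = (weights * item_emb).sum(dim=1)  # (batch, dim)

        # Gated fusion: learn a per-user blend of long- and short-term interest
        # rather than treating the two as equally important.
        g = torch.sigmoid(self.gate(torch.cat([long_term, short_term], dim=-1)))
        return g * long_term + (1 - g) * short_term   # hybrid interest

model = ParallelInterestSketch(dim=64)
items = torch.randn(2, 10, 64)       # 2 sessions, 10 clicks each
gaps = torch.rand(2, 10, 1)          # normalized time gaps
long_t = torch.randn(2, 64)
hybrid = model(items, gaps, long_t)  # (2, 64) hybrid interest representation
```

The sigmoid gate is what makes the fusion user-specific: it is computed from both interest vectors, so each user (and each session) gets its own mixing weights instead of a fixed 50/50 combination.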

Authors (3)
  1. Jing Zhu (50 papers)
  2. Yanan Xu (4 papers)
  3. Yanmin Zhu (15 papers)
Citations (9)
