Implicit Session Contexts for Next-Item Recommendations (2208.09076v1)

Published 18 Aug 2022 in cs.IR, cs.LG, and cs.SI

Abstract: Session-based recommender systems capture the short-term interest of a user within a session. Session contexts (i.e., a user's high-level interests or intents within a session) are not explicitly given in most datasets, and implicitly inferring session context as an aggregation of item-level attributes is crude. In this paper, we propose ISCON, which implicitly contextualizes sessions. ISCON first generates implicit contexts for sessions by creating a session-item graph, learning graph embeddings, and clustering to assign sessions to contexts. ISCON then trains a session context predictor and uses the predicted contexts' embeddings to enhance the next-item prediction accuracy. Experiments on four datasets show that ISCON achieves higher next-item prediction accuracy than state-of-the-art models. A case study of ISCON on the Reddit dataset confirms that the assigned session contexts are distinct and meaningful.
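The abstract's pipeline (session embeddings, clustering into implicit contexts, context embeddings enhancing the session representation) can be sketched roughly as below. This is a minimal illustration, not the paper's implementation: the random vectors stand in for learned session-item graph embeddings, the tiny k-means is a placeholder for whichever clustering ISCON uses, and the dimensions and concatenation scheme are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder for learned session-item graph embeddings
# (ISCON derives these from a session-item graph; here they are random).
num_sessions, dim, num_contexts = 100, 16, 5
session_emb = rng.normal(size=(num_sessions, dim))

def kmeans(X, k, iters=20, seed=0):
    """Tiny k-means: cluster session embeddings into k implicit contexts."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # assign each session to its nearest context center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # recompute centers (keep the old center if a cluster empties)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Step 1: assign each session an implicit context via clustering
contexts = kmeans(session_emb, num_contexts)

# Step 2: look up a context embedding and combine it with the
# session representation to enhance next-item prediction
context_table = rng.normal(size=(num_contexts, dim))
enhanced = np.concatenate([session_emb, context_table[contexts]], axis=1)
print(enhanced.shape)  # (100, 32)
```

In the full model, the context assignment would come from a trained session-context predictor rather than direct cluster lookup, and the enhanced representation would feed a next-item prediction head.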

Authors (6)
  1. Sejoon Oh (12 papers)
  2. Ankur Bhardwaj (1 paper)
  3. Jongseok Han (1 paper)
  4. Sungchul Kim (65 papers)
  5. Ryan A. Rossi (124 papers)
  6. Srijan Kumar (61 papers)
Citations (9)
