
Micro-Behavior Encoding for Session-based Recommendation (2204.02002v1)

Published 5 Apr 2022 in cs.IR

Abstract: Session-based Recommendation (SR) aims to predict the next item for recommendation based on previously recorded sessions of user interaction. The majority of existing approaches to SR focus on modeling the transition patterns of items. In such models, the so-called micro-behaviors describing how the user locates an item and carries out various activities on it (e.g., click, add-to-cart, and read-comments) are simply ignored. A few recent studies have tried to incorporate the sequential patterns of micro-behaviors into SR models. However, those sequential models still cannot effectively capture all the inherent interdependencies between micro-behavior operations. In this work, we aim to investigate the effects of the micro-behavior information in SR systematically. Specifically, we identify two different patterns of micro-behaviors: "sequential patterns" and "dyadic relational patterns". To build a unified model of user micro-behaviors, we first devise a multigraph to aggregate the sequential patterns from different items via a graph neural network, and then utilize an extended self-attention network to exploit the pair-wise relational patterns of micro-behaviors. Extensive experiments on three public real-world datasets show the superiority of the proposed approach over the state-of-the-art baselines and confirm the usefulness of these two different micro-behavior patterns for SR.
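The abstract describes a two-stage architecture: graph-based aggregation of micro-behavior sequential patterns followed by self-attention over dyadic (pair-wise) relations. The sketch below is a minimal illustration of that idea, not the authors' implementation; the module name, layer sizes, the single graph-propagation step, and the row-normalized adjacency input are all illustrative assumptions.

```python
# Hedged sketch of a micro-behavior-aware session encoder (PyTorch).
# Not the paper's code: a single graph-propagation step stands in for the
# multigraph GNN, and vanilla multi-head attention stands in for the
# extended self-attention network described in the abstract.
import torch
import torch.nn as nn


class MicroBehaviorSessionEncoder(nn.Module):
    def __init__(self, n_items, n_behaviors, dim=64, n_heads=4):
        super().__init__()
        self.item_emb = nn.Embedding(n_items, dim, padding_idx=0)
        self.behavior_emb = nn.Embedding(n_behaviors, dim, padding_idx=0)
        # One propagation step over the session graph ("sequential patterns").
        self.gnn_linear = nn.Linear(dim, dim)
        # Self-attention over all position pairs ("dyadic relational patterns").
        self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.out = nn.Linear(dim, dim)

    def forward(self, items, behaviors, adj):
        # items, behaviors: (batch, seq_len) integer ids
        # adj: (batch, seq_len, seq_len) row-normalized transition matrix
        h = self.item_emb(items) + self.behavior_emb(behaviors)
        # Aggregate sequential patterns by propagating along the session graph.
        h = h + torch.relu(self.gnn_linear(torch.bmm(adj, h)))
        # Capture pair-wise relations between micro-behavior positions.
        attn_out, _ = self.attn(h, h, h)
        return self.out(attn_out)  # (batch, seq_len, dim)


if __name__ == "__main__":
    enc = MicroBehaviorSessionEncoder(n_items=1000, n_behaviors=8)
    items = torch.randint(1, 1000, (2, 5))
    behaviors = torch.randint(1, 8, (2, 5))
    adj = torch.softmax(torch.randn(2, 5, 5), dim=-1)
    print(enc(items, behaviors, adj).shape)  # torch.Size([2, 5, 64])
```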

Authors (5)
  1. Jiahao Yuan (16 papers)
  2. Wendi Ji (4 papers)
  3. Dell Zhang (26 papers)
  4. Jinwei Pan (2 papers)
  5. Xiaoling Wang (42 papers)
Citations (6)
