
Joint Multi-grained Popularity-aware Graph Convolution Collaborative Filtering for Recommendation (2210.04614v1)

Published 10 Oct 2022 in cs.IR

Abstract: Graph Convolution Networks (GCNs), with their efficient ability to capture high-order connectivity in graphs, have been widely applied in recommender systems. Stacking multiple neighbor aggregations is the major operation in GCNs. It implicitly captures popularity features because the number of neighbor nodes reflects the popularity of a node. However, existing GCN-based methods ignore a universal problem: users' sensitivity to item popularity is differentiated, but the neighbor aggregations in GCNs actually fix this sensitivity through Graph Laplacian Normalization, leading to suboptimal personalization. In this work, we propose to model multi-grained popularity features and jointly learn them together with high-order connectivity, to match the differentiation of user preferences exhibited in popularity features. Specifically, we develop a Joint Multi-grained Popularity-aware Graph Convolution Collaborative Filtering model, JMP-GCF for short, which uses popularity-aware embedding generation to construct multi-grained popularity features, and uses the idea of joint learning to capture the signals within and between different granularities of popularity features that are relevant for modeling user preferences. Additionally, we propose a multistage stacked training strategy to speed up model convergence. We conduct extensive experiments on three public datasets to show the state-of-the-art performance of JMP-GCF.
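
The sketch below is an illustrative, hedged reading of the abstract, not the authors' implementation: it shows how node degree (popularity) enters standard GCN neighbor aggregation through symmetric Laplacian normalization, and one hypothetical way multi-grained popularity views of item embeddings could be formed. The names `grains` and the degree exponents are assumptions for illustration only.

```python
# Minimal sketch (assumptions, not the paper's code): popularity enters
# LightGCN-style propagation via the fixed 1/sqrt(d_u * d_i) normalization;
# "multi-grained" views are illustrated by rescaling with degree exponents.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, dim = 4, 5, 8

# Toy binary user-item interaction matrix R.
R = (rng.random((n_users, n_items)) < 0.4).astype(float)

# Degrees reflect popularity: d_u = |N(u)|, d_i = |N(i)|.
d_u = R.sum(axis=1) + 1e-8
d_i = R.sum(axis=0) + 1e-8

# Symmetric (Laplacian) normalization fixes how popularity is weighted
# for every user, which is the limitation the abstract points out.
A_hat = R / np.sqrt(np.outer(d_u, d_i))

E_user = rng.normal(size=(n_users, dim))
E_item = rng.normal(size=(n_items, dim))

# One propagation layer: aggregate normalized neighbor embeddings.
E_user_next = A_hat @ E_item
E_item_next = A_hat.T @ E_user

# Hypothetical multi-grained popularity features: several degree-scaled
# views of the item embeddings that a joint-learning objective could combine.
grains = [0.0, 0.5, 1.0]  # illustrative granularities, not from the paper
pop_features = [E_item_next * (d_i[:, None] ** g) for g in grains]
print([f.shape for f in pop_features])  # three (n_items, dim) views
```
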

Authors (5)
  1. Kang Liu (207 papers)
  2. Feng Xue (60 papers)
  3. Xiangnan He (200 papers)
  4. Dan Guo (66 papers)
  5. Richang Hong (117 papers)
Citations (30)
