
DSKReG: Differentiable Sampling on Knowledge Graph for Recommendation with Relational GNN (2108.11883v1)

Published 26 Aug 2021 in cs.LG

Abstract: In the information explosion era, recommender systems (RSs) are widely studied and applied to discover user-preferred information. An RS performs poorly when suffering from the cold-start issue, which can be alleviated by incorporating Knowledge Graphs (KGs) as side information. However, most existing works neglect the facts that node degrees in KGs are skewed and that a massive number of interactions in KGs are recommendation-irrelevant. To address these problems, in this paper, we propose Differentiable Sampling on Knowledge Graph for Recommendation with Relational GNN (DSKReG), which learns the relevance distribution of connected items from KGs and samples suitable items for recommendation following this distribution. We devise a differentiable sampling strategy, which enables the selection of relevant items to be jointly optimized with the model training procedure. The experimental results demonstrate that our model outperforms state-of-the-art KG-based recommender systems. The code is available online at https://github.com/YuWang-1024/DSKReG.
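
The core idea in the abstract is that each KG neighbor of an item is scored for recommendation relevance, and neighbors are then drawn from the resulting distribution with a relaxation that keeps the sampling step differentiable, so the sampler can be trained jointly with the recommender. Below is a minimal, hypothetical PyTorch sketch of that pattern using a Gumbel-Softmax relaxation over neighbor relevance scores; the scoring function, temperature, and all names are illustrative assumptions, not the authors' implementation (see their repository for the actual code).

```python
import torch
import torch.nn.functional as F

def relevance_scores(item_emb, rel_embs, neigh_embs):
    # Hypothetical scorer: rate each (relation, neighbor) pair against the
    # target item with a dot product; DSKReG's exact scoring may differ.
    return (item_emb * (rel_embs + neigh_embs)).sum(dim=-1)

def gumbel_softmax_sample(scores, tau=0.5):
    # Relaxed categorical sample over neighbors. Gradients flow through the
    # softmax, so neighbor selection is optimized with the rest of the model.
    gumbel = -torch.log(-torch.log(torch.rand_like(scores) + 1e-9) + 1e-9)
    return F.softmax((scores + gumbel) / tau, dim=-1)

# Toy usage: one target item with 5 KG neighbors, embedding dimension 8.
torch.manual_seed(0)
d = 8
item = torch.randn(d)
rels = torch.randn(5, d)
neigh = torch.randn(5, d)
weights = gumbel_softmax_sample(relevance_scores(item, rels, neigh))
aggregated = weights @ neigh  # soft, differentiable neighborhood aggregation
```

Lowering the temperature makes the relaxed sample approach a hard top-1 choice, while higher values keep the aggregation closer to a plain attention-style average; this trade-off is what lets relevant-neighbor selection and model training be optimized together.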

Authors (5)
  1. Yu Wang (939 papers)
  2. Zhiwei Liu (114 papers)
  3. Ziwei Fan (22 papers)
  4. Lichao Sun (186 papers)
  5. Philip S. Yu (592 papers)
Citations (45)



GitHub: https://github.com/YuWang-1024/DSKReG