
Knowledge-aware Graph Neural Networks with Label Smoothness Regularization for Recommender Systems (1905.04413v3)

Published 11 May 2019 in cs.LG, cs.IR, and stat.ML

Abstract: Knowledge graphs capture structured information and relations between a set of entities or items. As such knowledge graphs represent an attractive source of information that could help improve recommender systems. However, existing approaches in this domain rely on manual feature engineering and do not allow for an end-to-end training. Here we propose Knowledge-aware Graph Neural Networks with Label Smoothness regularization (KGNN-LS) to provide better recommendations. Conceptually, our approach computes user-specific item embeddings by first applying a trainable function that identifies important knowledge graph relationships for a given user. This way we transform the knowledge graph into a user-specific weighted graph and then apply a graph neural network to compute personalized item embeddings. To provide better inductive bias, we rely on label smoothness assumption, which posits that adjacent items in the knowledge graph are likely to have similar user relevance labels/scores. Label smoothness provides regularization over the edge weights and we prove that it is equivalent to a label propagation scheme on a graph. We also develop an efficient implementation that shows strong scalability with respect to the knowledge graph size. Experiments on four datasets show that our method outperforms state of the art baselines. KGNN-LS also achieves strong performance in cold-start scenarios where user-item interactions are sparse.

Authors (7)
  1. Hongwei Wang (150 papers)
  2. Fuzheng Zhang (60 papers)
  3. Mengdi Zhang (37 papers)
  4. Jure Leskovec (233 papers)
  5. Miao Zhao (14 papers)
  6. Wenjie Li (183 papers)
  7. Zhongyuan Wang (105 papers)
Citations (514)

Summary

Overview of Knowledge-aware Graph Neural Networks with Label Smoothness Regularization for Recommender Systems

The paper "Knowledge-aware Graph Neural Networks with Label Smoothness Regularization for Recommender Systems" introduces a novel approach to enhance recommendation systems by leveraging the structure of knowledge graphs (KGs). The method is predicated on extending Graph Neural Networks (GNNs) to capture both semantic item relationships and user preferences effectively. This is achieved through the proposed Knowledge-aware Graph Neural Networks with Label Smoothness regularization (KGNN-LS).

Core Methodology

The KGNN-LS framework addresses two limitations of earlier KG-aware recommendation techniques: dependence on manual feature engineering and the lack of end-to-end training. The pipeline first transforms the KG into a user-specific weighted graph via a trainable function that scores how important each KG relation is to a given user. A graph neural network then propagates and aggregates entity embeddings over this weighted graph, so the layer-wise propagation reflects user-specific preferences. Because learning edge weights adds considerable flexibility, the authors introduce label smoothness regularization to prevent overfitting: it encourages adjacent entities in the KG to receive similar user relevance scores and is shown to be equivalent to a label propagation scheme over the weighted graph, which stabilizes and guides training. A minimal sketch of this pipeline follows below.
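The sketch below (NumPy, not the authors' implementation) illustrates the two ingredients just described: a user-specific relation-scoring function that reweights KG edges and drives one step of neighborhood aggregation, and a leave-one-out label propagation term standing in for the label smoothness regularizer. The inner-product scoring function, the single-layer aggregation, and all names here (score_relation, aggregate, label_smoothness_loss) are illustrative assumptions rather than the paper's exact formulation.

# Illustrative sketch only: user-specific edge weighting, one aggregation
# step, and a leave-one-out label propagation regularizer on a toy KG.
import numpy as np

rng = np.random.default_rng(0)
dim, n_entities, n_relations = 16, 100, 5
entity_emb = rng.normal(size=(n_entities, dim))     # item/entity embeddings
relation_emb = rng.normal(size=(n_relations, dim))  # relation embeddings
user_emb = rng.normal(size=dim)                     # one user's embedding
edges = [(0, 1, 2), (0, 3, 5), (2, 0, 7), (5, 2, 9)]  # toy KG: (head, relation, tail)

def score_relation(user, relation):
    # User-specific importance of a relation (assumed inner-product form).
    return float(np.dot(user, relation))

def aggregate(entity, user):
    # One GNN layer: combine an entity with its neighbors, weighted by the
    # user-specific relation scores (softmax-normalized).
    neigh = [(t, score_relation(user, relation_emb[r]))
             for h, r, t in edges if h == entity]
    if not neigh:
        return entity_emb[entity]
    w = np.exp(np.array([s for _, s in neigh]))
    w /= w.sum()
    neigh_vec = sum(wi * entity_emb[t] for (t, _), wi in zip(neigh, w))
    return np.tanh(entity_emb[entity] + neigh_vec)

def label_smoothness_loss(labels, observed, user, n_iter=3):
    # Hold out each observed label, propagate the remaining labels over the
    # user-weighted graph, and penalize the gap to the held-out true label.
    loss = 0.0
    for v in np.flatnonzero(observed):
        pred = np.where(observed, labels, 0.0)
        pred[v] = 0.0                                 # hide item v's label
        for _ in range(n_iter):
            nxt = pred.copy()
            for h, r, t in edges:
                w = 1.0 / (1.0 + np.exp(-score_relation(user, relation_emb[r])))
                nxt[h] = (1 - w) * nxt[h] + w * pred[t]
            clamp = observed.copy()
            clamp[v] = False
            nxt[clamp] = labels[clamp]                # re-clamp the other labels
            pred = nxt
        loss += (pred[v] - labels[v]) ** 2
    return loss

# Toy usage: a user-specific embedding for item 0 and the regularizer value.
labels = np.zeros(n_entities); labels[[2, 7]] = 1.0
observed = np.zeros(n_entities, dtype=bool); observed[[2, 7, 9]] = True
item0_vec = aggregate(0, user_emb)
reg = label_smoothness_loss(labels, observed, user_emb)

In the actual model, the embeddings and the relation-scoring function are trained jointly end to end, the aggregation is stacked over multiple hops, and the label smoothness term is added to the recommendation loss with a balancing coefficient.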

Experimental Results

Experiments on four datasets (MovieLens, Book-Crossing, Last.FM, and Dianping-Food) show that KGNN-LS outperforms contemporary baselines. The model improves recommendation accuracy as measured by recall and AUC, and it remains effective in cold-start scenarios where user-item interactions are sparse, indicating that the KG structure helps the model generalize when supervision is limited.

Theoretical and Practical Implications

The integration of label smoothness regularization with GNN architectures suggests a promising direction for mitigating overfitting when user-item interactions are sparse, and it offers a structured way to exploit the connectivity information already present in KGs. Practically, the scalable implementation of KGNN-LS makes it a realistic candidate for real-world systems that must handle large and complex graphs.

Future Directions

The proposed method opens several avenues for future research. Exploring variants of GNNs that incorporate different aspects of label smoothness could provide deeper insights into optimizing graph-based learning. Additionally, applying the framework to domains beyond recommender systems, such as link prediction and node classification, could yield broader applicability and reveal further nuances in handling heterogeneous relational data.

In conclusion, KGNN-LS stands as a methodologically sound approach that effectively harnesses the potential of KGs through graph neural network architectures, supported by innovative regularization techniques. This contributes to both the theoretical understanding and practical enhancement of recommendation systems in increasingly data-rich environments.