r-GAT: Relational Graph Attention Network for Multi-Relational Graphs (2109.05922v1)

Published 13 Sep 2021 in cs.AI, cs.CL, and cs.LG

Abstract: The Graph Attention Network (GAT) models only simple undirected, single-relational graph data. This limits its ability to handle more general and complex multi-relational graphs, which contain entities connected by directed links with different labels (e.g., knowledge graphs). Directly applying GAT to multi-relational graphs therefore leads to sub-optimal solutions. To tackle this issue, we propose r-GAT, a relational graph attention network that learns multi-channel entity representations. Specifically, each channel corresponds to a latent semantic aspect of an entity. This enables us to aggregate neighborhood information for the current aspect using relation features. We further propose a query-aware attention mechanism that selects useful aspects for downstream tasks. Extensive experiments on link prediction and entity classification show that r-GAT models multi-relational graphs effectively. We also demonstrate the interpretability of our approach through a case study.
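The two mechanisms the abstract names can be illustrated with a minimal numpy sketch: each entity carries several channels (latent aspects), neighborhood messages are scored per channel using relation features, and a query vector then weights the channels into one representation. All shapes, the additive message form (`h_neighbor + rel`), and the dot-product channel selection are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
N, R, C, d = 4, 2, 3, 8  # entities, relation types, channels (aspects), channel dim

H = rng.normal(size=(N, C, d))             # multi-channel entity embeddings
rel = rng.normal(size=(R, d))              # relation features
edges = [(0, 0, 1), (0, 1, 2), (1, 0, 3)]  # directed (src, relation, dst) triples
a = rng.normal(size=(C, 2 * d))            # per-channel attention vectors (assumed form)

# Per-channel, relation-aware neighborhood aggregation:
# each aspect attends over incoming messages built from neighbor + relation features.
H_new = H.copy()
for dst in range(N):
    nbrs = [(s, r) for (s, r, t) in edges if t == dst]
    if not nbrs:
        continue  # no incoming edges: keep the original embedding
    for c in range(C):
        msgs = np.stack([H[s, c] + rel[r] for (s, r) in nbrs])  # relation-aware messages
        scores = np.array([a[c] @ np.concatenate([H[dst, c], m]) for m in msgs])
        alpha = softmax(scores)          # attention over this aspect's neighborhood
        H_new[dst, c] = alpha @ msgs     # aggregated aspect representation

# Query-aware aspect selection: weight channels by similarity to a task query vector.
q = rng.normal(size=(d,))
beta = softmax(H_new @ q / np.sqrt(d), axis=1)        # (N, C) channel weights
entity_repr = (beta[..., None] * H_new).sum(axis=1)   # (N, d) fused representation
print(entity_repr.shape)  # (4, 8)
```

The query-aware step is what makes the aspects task-dependent: a link-prediction query and a classification query would induce different channel weightings over the same multi-channel embeddings.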

Authors (5)
  1. Meiqi Chen (11 papers)
  2. Yuan Zhang (331 papers)
  3. Xiaoyu Kou (7 papers)
  4. Yuntao Li (19 papers)
  5. Yan Zhang (954 papers)
Citations (11)