
R-GCN: The R Could Stand for Random (2203.02424v2)

Published 4 Mar 2022 in cs.LG, cs.AI, and stat.ML

Abstract: The inception of the Relational Graph Convolutional Network (R-GCN) marked a milestone in the Semantic Web domain as a widely cited method that generalises end-to-end hierarchical representation learning to Knowledge Graphs (KGs). R-GCNs generate representations for nodes of interest by repeatedly aggregating parameterised, relation-specific transformations of their neighbours. However, in this paper, we argue that the R-GCN's main contribution lies in this "message passing" paradigm, rather than the learned weights. To this end, we introduce the "Random Relational Graph Convolutional Network" (RR-GCN), which leaves all parameters untrained and thus constructs node embeddings by aggregating randomly transformed random representations from neighbours, i.e., with no learned parameters. We empirically show that RR-GCNs can compete with fully trained R-GCNs in both node classification and link prediction settings.
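The untrained message-passing scheme the abstract describes can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the toy triples, sizes, mean aggregation, and tanh nonlinearity are all assumptions chosen for brevity. The essential point it demonstrates is that every weight matrix is drawn randomly and never trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy knowledge graph: (subject, relation, object) triples over
# 4 nodes and 2 relations. Purely illustrative, not from the paper.
triples = [(0, 0, 1), (1, 0, 2), (2, 1, 3), (3, 1, 0)]
num_nodes, num_relations, dim = 4, 2, 8

# Random, untrained initial node representations.
H = rng.standard_normal((num_nodes, dim))

# Random, untrained relation-specific transformations
# (one matrix per relation, plus one for the self-loop).
W_rel = rng.standard_normal((num_relations, dim, dim))
W_self = rng.standard_normal((dim, dim))

def rr_gcn_layer(H):
    """One round of untrained relational message passing: each node
    aggregates randomly transformed representations of its neighbours."""
    out = H @ W_self                    # self-loop contribution
    counts = np.ones(num_nodes)         # counts the self-loop message
    for s, r, o in triples:
        out[o] += H[s] @ W_rel[r]       # relation-specific message s -> o
        counts[o] += 1
    out /= counts[:, None]              # mean aggregation
    return np.tanh(out)                 # nonlinearity between layers

# Stacking two such layers yields node embeddings with no learned parameters;
# in the paper these feed a downstream classifier.
H2 = rr_gcn_layer(rr_gcn_layer(H))
print(H2.shape)  # (4, 8)
```

Because nothing is trained, the only downstream learning happens in whatever shallow model consumes these embeddings for node classification or link prediction.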

Authors (4)
  1. Vic Degraeve (1 paper)
  2. Gilles Vandewiele (10 papers)
  3. Femke Ongenae (12 papers)
  4. Sofie Van Hoecke (24 papers)
Citations (9)