
Ruleformer: Context-aware Differentiable Rule Mining over Knowledge Graph (2209.05815v1)

Published 13 Sep 2022 in cs.LO

Abstract: Rule mining is an effective approach for reasoning over knowledge graphs (KGs). Existing works concentrate mainly on mining rules; however, several rules may be applicable when reasoning about a single relation, and how to select appropriate rules for completing different triples has not been discussed. In this paper, we propose to take context information into consideration, which helps select suitable rules for inference tasks. Based on this idea, we propose a transformer-based rule mining approach, Ruleformer. It consists of two blocks: 1) an encoder that extracts context information from the subgraph of the head entity using a modified attention mechanism, and 2) a decoder that aggregates the subgraph information from the encoder output and generates a probability distribution over relations at each step of reasoning. The basic idea behind Ruleformer is to regard the rule mining process as a sequence-to-sequence task. To feed the subgraph to the encoder as a sequence while retaining the graph structure, we devise a relational attention mechanism in the Transformer. Experiment results show the necessity of considering context information in the rule mining task and the effectiveness of our model.

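To make the relational attention idea concrete, here is a minimal PyTorch sketch, not the authors' implementation: the class name `RelationalAttention`, the relation-bias scoring, and the edge-masking scheme are illustrative assumptions about how attention scores between subgraph entities can be conditioned on the relation linking them.

```python
# Illustrative sketch only (assumed design, not the paper's exact formulation):
# attention over subgraph entities where each pairwise score is biased by a
# learned embedding of the relation on the connecting edge, and pairs with no
# edge are masked out so the sequence input retains the graph structure.
import math
import torch
import torch.nn as nn


class RelationalAttention(nn.Module):
    """Single-head attention over subgraph entities, biased by edge relations."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        # One learned embedding per relation type; id 0 means "no edge".
        self.rel_emb = nn.Embedding(num_relations + 1, dim)
        self.scale = math.sqrt(dim)

    def forward(self, x: torch.Tensor, rel_ids: torch.Tensor) -> torch.Tensor:
        # x:       (n, dim) entity embeddings of the flattened subgraph
        # rel_ids: (n, n)   id of the relation on edge i -> j, 0 if no edge
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        r = self.rel_emb(rel_ids)  # (n, n, dim) per-pair relation embeddings
        # Dot-product scores plus a relation-dependent bias, so the attention
        # pattern reflects which relation (if any) links two entities.
        scores = (q @ k.T + torch.einsum("id,ijd->ij", q, r)) / self.scale
        # Mask entity pairs with no connecting edge to keep the graph structure.
        scores = scores.masked_fill(rel_ids == 0, float("-inf"))
        return torch.softmax(scores, dim=-1) @ v


# Toy usage: 5 entities, 4 relation types, 16-dim embeddings.
layer = RelationalAttention(dim=16, num_relations=4)
x = torch.randn(5, 16)
rel_ids = torch.randint(0, 5, (5, 5))
rel_ids.fill_diagonal_(1)  # self-loops so every row attends to at least itself
out = layer(x, rel_ids)    # (5, 16) context-aware entity representations
print(out.shape)
```

Biasing the score matrix with per-edge relation embeddings is one standard way to inject graph structure into a Transformer; the paper's actual attention formulation and decoder may differ.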
Authors (6)
  1. Zezhong Xu (10 papers)
  2. Peng Ye (142 papers)
  3. Hui Chen (298 papers)
  4. Meng Zhao (48 papers)
  5. Huajun Chen (198 papers)
  6. Wen Zhang (170 papers)
Citations (16)
