
Modeling Relational Patterns for Logical Query Answering over Knowledge Graphs (2303.11858v2)

Published 21 Mar 2023 in cs.DB, cs.AI, and cs.LG

Abstract: Answering first-order logical (FOL) queries over knowledge graphs (KGs) remains a challenging task, mainly due to KG incompleteness. Query embedding approaches this problem by computing low-dimensional vector representations of entities, relations, and logical queries. KGs exhibit relational patterns such as symmetry and composition, and modeling these patterns can further enhance the performance of query embedding models. However, the role of such patterns in answering FOL queries with query embedding models has not yet been studied in the literature. In this paper, we fill this research gap and empower FOL query reasoning with pattern inference by introducing an inductive bias that allows relation patterns to be learned. To this end, we develop a novel query embedding method, RoConE, that defines query regions as geometric cones and algebraic query operators as rotations in complex space. RoConE combines the advantages of the cone as a well-specified geometric representation for query embedding with the rotation operator as a powerful algebraic operation for pattern inference. Our experimental results on several benchmark datasets confirm the advantage of relational patterns for enhancing the logical query answering task.
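The rotation-based operator that RoConE uses for pattern inference is easiest to see in a small numerical sketch. The snippet below is an illustrative toy, not the paper's implementation: the embedding dimension, variable names, and the element-wise unit-complex parameterization are assumptions (in the spirit of RotatE-style rotations). It shows why composition and symmetry patterns hold by construction when relations act as rotations in complex space.

```python
import numpy as np

# Minimal sketch (not the paper's code): relations as element-wise rotations in
# complex space. Each relation is a unit-modulus complex vector e^{i*theta};
# applying it to an entity embedding h rotates h element-wise: t = h * r.

rng = np.random.default_rng(0)
dim = 4                                   # toy embedding dimension (assumption)

def relation(theta):
    """A relation as a unit-modulus complex vector of rotation angles."""
    return np.exp(1j * np.asarray(theta))

# Random entity embedding on the unit circle (element-wise).
h = np.exp(1j * rng.uniform(0, 2 * np.pi, dim))

r1 = relation(rng.uniform(0, 2 * np.pi, dim))
r2 = relation(rng.uniform(0, 2 * np.pi, dim))

# Composition pattern: applying r1 then r2 equals applying the single rotation r1 * r2,
# because rotation angles add.
assert np.allclose((h * r1) * r2, h * (r1 * r2))

# Symmetry pattern: a relation with angle pi is its own inverse
# (rotating by pi twice returns the original embedding).
sym = relation(np.full(dim, np.pi))
assert np.allclose(h * sym * sym, h)

print("composition and symmetry patterns hold for rotation-based relation operators")
```

In the full model, these rotations act on cone-shaped query regions rather than point embeddings, which is how RoConE couples pattern inference with region-based query answering.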

Authors (6)
  1. Yunjie He (8 papers)
  2. Mojtaba Nayyeri (29 papers)
  3. Bo Xiong (84 papers)
  4. Evgeny Kharlamov (34 papers)
  5. Steffen Staab (78 papers)
  6. Yuqicheng Zhu (12 papers)
Citations (2)
