Anchoring Path for Inductive Relation Prediction in Knowledge Graphs (2312.13596v1)
Abstract: Aiming to accurately predict missing edges representing relations between entities, which are pervasive in real-world Knowledge Graphs (KGs), relation prediction plays a critical role in enhancing the comprehensiveness and utility of KGs. Recent research has focused on path-based methods for their inductive and explainable properties. However, these methods face a significant challenge when many reasoning paths fail to form Closed Paths (CPs) in the KG. To address this challenge, we propose the Anchoring Path Sentence Transformer (APST), which introduces Anchoring Paths (APs) to alleviate the reliance on CPs. Specifically, we develop a search-based description retrieval method to enrich entity descriptions and an assessment mechanism to evaluate the rationality of APs. APST takes both APs and CPs as inputs to a unified Sentence Transformer architecture, enabling comprehensive predictions and high-quality explanations. We evaluate APST on three public datasets and achieve state-of-the-art (SOTA) performance in 30 of 36 transductive, inductive, and few-shot experimental settings.
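The core idea — verbalizing reasoning paths into sentences and scoring a candidate relation by how well supporting paths match it — can be illustrated with a minimal, self-contained sketch. This is not the paper's implementation: the bag-of-words "encoder" is a toy stand-in for the Sentence Transformer, and all entity, relation, and function names are illustrative assumptions.

```python
from collections import Counter
import math

def verbalize_path(path):
    """Turn a path of (head, relation, tail) triples into a sentence."""
    return " ".join(f"{h} {r.replace('_', ' ')} {t}" for h, r, t in path)

def embed(sentence):
    """Toy bag-of-words embedding standing in for a sentence encoder."""
    return Counter(sentence.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def score_relation(query_sentence, path_sentences):
    """Score a candidate relation by its best-matching supporting path."""
    q = embed(query_sentence)
    return max(cosine(q, embed(s)) for s in path_sentences)

# A closed path connecting both query entities...
cp = [("Alice", "works_in", "Paris"), ("Paris", "capital_of", "France")]
# ...and an anchoring-style path touching only one query entity.
ap = [("Alice", "speaks", "French")]
paths = [verbalize_path(cp), verbalize_path(ap)]
print(score_relation("Alice lives in France", paths))
```

In APST, the toy encoder above would be replaced by a trained Sentence Transformer, and anchoring paths supplement closed paths so that predictions remain possible even when no closed path exists between the query entities.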