Conditional Logical Message Passing Transformer for Complex Query Answering (2402.12954v2)

Published 20 Feb 2024 in cs.LG, cs.AI, and cs.LO

Abstract: Complex Query Answering (CQA) over Knowledge Graphs (KGs) is a challenging task. Given that KGs are usually incomplete, neural models have been proposed to solve CQA by performing multi-hop logical reasoning. However, most of them cannot perform well on both one-hop and multi-hop queries simultaneously. Recent work proposes a logical message passing mechanism based on pre-trained neural link predictors. While effective on both one-hop and multi-hop queries, it ignores the difference between constant and variable nodes in a query graph. In addition, during the node embedding update stage, this mechanism cannot dynamically measure the importance of different messages, and it remains unclear whether it can capture the implicit logical dependencies between a node and its received messages. In this paper, we propose the Conditional Logical Message Passing Transformer (CLMPT), which accounts for the difference between constants and variables when using pre-trained neural link predictors and performs message passing conditionally on the node type. We empirically verify that this approach reduces computational costs without affecting performance. Furthermore, CLMPT uses a transformer to aggregate received messages and update the corresponding node embedding. Through the self-attention mechanism, CLMPT can assign adaptive weights to the elements of an input set consisting of the received messages and the corresponding node, and explicitly model logical dependencies between these elements. Experimental results show that CLMPT is a new state-of-the-art neural CQA model. Code is available at https://github.com/qianlima-lab/CLMPT.
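
The abstract describes two mechanisms: message passing that is conditioned on whether a node is a constant or a variable, and a transformer whose self-attention aggregates a node's received messages together with its own embedding. The sketch below is a minimal, hypothetical PyTorch illustration of these two ideas under that reading; the class, argument, and variable names are illustrative only and are not taken from the paper or its repository.

```python
# Hypothetical sketch of the two ideas described in the abstract:
# (1) message passing conditioned on node type (constant vs. variable), and
# (2) transformer-style self-attention over the set {node embedding} U {messages}.
# Names are illustrative; this is not the authors' implementation.
import torch
import torch.nn as nn


class ConditionalMessageAggregator(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, node_emb: torch.Tensor, messages: torch.Tensor,
                is_variable: bool) -> torch.Tensor:
        # node_emb: (dim,), messages: (num_msgs, dim)
        if not is_variable:
            # Constant nodes keep their grounded embedding; skipping the update
            # is one plausible way to realize "conditional" message passing.
            return node_emb
        # Variable nodes: form the input set of the node embedding plus its
        # received messages, and let self-attention assign adaptive weights.
        tokens = torch.cat([node_emb.unsqueeze(0), messages], dim=0).unsqueeze(0)
        attended, _ = self.attn(tokens, tokens, tokens)
        # Read the updated embedding off the node's own position (index 0),
        # with a residual connection and layer norm.
        return self.norm(attended[0, 0] + node_emb)


# Toy usage: one variable node receiving five random message embeddings.
agg = ConditionalMessageAggregator(dim=32)
updated = agg(torch.randn(32), torch.randn(5, 32), is_variable=True)
print(updated.shape)  # torch.Size([32])
```

In this toy version, constant nodes simply retain their embeddings, which is one possible interpretation of conditioning on node type; the paper's actual conditioning scheme and update rule may differ.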

Authors (4)
  1. Chongzhi Zhang (14 papers)
  2. Zhiping Peng (3 papers)
  3. Junhao Zheng (22 papers)
  4. Qianli Ma (77 papers)