
Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction (2309.11528v3)

Published 20 Sep 2023 in cs.AI

Abstract: Inductive link prediction -- where the entities seen during training and inference can differ -- has shown great potential for completing evolving knowledge graphs in an entity-independent manner. Many popular methods focus mainly on modeling graph-level features, while edge-level interactions -- especially the semantic correlations between relations -- have been less explored. However, we observe a desirable property of these semantic correlations: they are inherently edge-level and entity-independent. This suggests their great potential for the entity-independent inductive link prediction task. Motivated by this observation, we propose a novel subgraph-based method, TACO, that models Topology-Aware COrrelations between relations, which are closely tied to their topological structures within subgraphs. Specifically, we prove that the semantic correlation between any two relations can be categorized into one of seven topological patterns, and we then propose the Relational Correlation Network (RCN) to learn the importance of each pattern. To further exploit the potential of RCN, we propose the Complete Common Neighbor induced subgraph, which effectively preserves complete topological patterns within the subgraph. Extensive experiments demonstrate that TACO effectively unifies graph-level information and edge-level interactions to perform joint reasoning, yielding superior performance over existing state-of-the-art methods on the inductive link prediction task.
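The core observation -- that the way two relations connect through shared entities can be enumerated into a small, fixed set of topological patterns -- can be illustrated with a short sketch. The seven labels below are a plausible enumeration for two triples sharing zero or more entities (parallel, loop, four single-endpoint overlaps, and disconnected); the exact categories and names used by TACO may differ, and `topo_pattern` is a hypothetical helper, not the paper's implementation.

```python
def topo_pattern(edge1, edge2):
    """Classify how two triples (h, r, t) connect through shared entities.

    Illustrative enumeration only; the paper's seven patterns may be
    defined differently.
    """
    h1, _, t1 = edge1
    h2, _, t2 = edge2
    if h1 == h2 and t1 == t2:
        return "parallel"      # same endpoints, same direction
    if h1 == t2 and t1 == h2:
        return "loop"          # same endpoints, opposite directions
    if h1 == h2:
        return "head-head"     # edges share their head entity
    if t1 == t2:
        return "tail-tail"     # edges share their tail entity
    if t1 == h2:
        return "tail-head"     # edge1's tail is edge2's head
    if h1 == t2:
        return "head-tail"     # edge1's head is edge2's tail
    return "disconnected"      # no shared entity
```

Because the classification depends only on which endpoints coincide, never on the identity of the entities themselves, it is entity-independent -- the property the abstract highlights as the basis for inductive (unseen-entity) reasoning.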

