Temporal Inductive Logic Reasoning over Hypergraphs (2206.05051v2)

Published 9 Jun 2022 in cs.LG, cs.AI, and cs.LO

Abstract: Inductive logic reasoning is a fundamental task in graph analysis, which aims to generalize patterns from data. This task has been extensively studied for traditional graph representations, such as knowledge graphs (KGs), using techniques like inductive logic programming (ILP). Existing ILP methods assume learning from KGs with static facts and binary relations. Beyond KGs, graph structures are widely present in other applications such as procedural instructions, scene graphs, and program executions. While ILP would benefit these applications, applying it to such graphs is nontrivial: they are more complex than KGs, typically involving timestamps and n-ary relations, and are effectively hypergraphs with temporal events. In this work, we propose temporal inductive logic reasoning (TILR), an ILP method that reasons on temporal hypergraphs. To enable hypergraph reasoning, we introduce the multi-start random B-walk, a novel graph traversal method for hypergraphs. By combining it with a path-consistency algorithm, TILR learns logic rules by generalizing from both temporal and relational data. To address the lack of hypergraph benchmarks, we create and release two temporal hypergraph datasets: YouCook2-HG and nuScenes-HG. Experiments on these benchmarks demonstrate that TILR achieves superior reasoning capability over various strong baselines.
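The abstract's core traversal primitive, the multi-start random B-walk, can be pictured with a small sketch. In a directed B-hypergraph, a hyperedge has a *set* of tail vertices and a single head, and it can only be traversed once every tail vertex is reached; starting walkers from multiple vertices lets tail sets be satisfied jointly. The data structures and function below are illustrative assumptions for exposition, not the paper's actual implementation, and the cooking-style vertex names are made up.

```python
import random

class BHyperedge:
    """A directed B-hyperedge: a set of tail vertices and one head vertex.
    (Hypothetical representation, not the paper's code.)"""
    def __init__(self, tails, head, label):
        self.tails = frozenset(tails)
        self.head = head
        self.label = label

def multi_start_random_b_walk(edges, starts, max_steps, rng=random):
    """Sketch of a multi-start random B-walk: walkers begin at `starts`;
    at each step we pick a random hyperedge whose tail set is fully
    covered by the walkers' current positions, record its label, and
    move the walkers covering that tail onto the edge's head."""
    positions = set(starts)
    path = []
    for _ in range(max_steps):
        # An edge is traversable only if *all* its tails are occupied.
        candidates = [e for e in edges if e.tails <= positions]
        if not candidates:
            break
        e = rng.choice(candidates)
        positions = (positions - e.tails) | {e.head}
        path.append(e.label)
    return path

# Toy procedural graph in the spirit of YouCook2-style instructions:
edges = [
    BHyperedge({"egg", "pan"}, "fried_egg", "fry"),
    BHyperedge({"fried_egg", "plate"}, "dish", "plate_up"),
]
walk = multi_start_random_b_walk(edges, {"egg", "pan", "plate"}, max_steps=5)
print(walk)  # ['fry', 'plate_up'] — "plate_up" is reachable only because
             # a second walker started at "plate"
```

Note how a single-start walk from `"egg"` alone could never fire `"fry"`, since its tail `{"egg", "pan"}` would never be fully covered; this is the motivation for the multi-start variant.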

