PipeNet: Question Answering with Semantic Pruning over Knowledge Graphs (2401.17536v2)

Published 31 Jan 2024 in cs.CL

Abstract: It is widely acknowledged that incorporating explicit knowledge graphs (KGs) can benefit question answering. Existing approaches typically follow a grounding-reasoning pipeline in which entity nodes are first grounded for the query (the question and candidate answers), and a reasoning module then reasons over the matched multi-hop subgraph to predict the answer. Although this pipeline largely alleviates the problem of extracting essential information from giant KGs, efficiency remains an open challenge as the number of hops used to ground the subgraph grows. In this paper, we aim to find the entity nodes in the subgraph that are semantically related to the query, so as to improve the efficiency of graph reasoning with the KG. We propose a grounding-pruning-reasoning pipeline that prunes noisy nodes, substantially reducing computation cost and memory usage while still yielding a strong subgraph representation. Specifically, the pruning module first scores concept nodes by the dependency distance between matched spans and then prunes nodes according to their score ranks. To facilitate evaluation of the pruned subgraphs, we also propose a graph attention network (GAT)-based module that reasons over the subgraph data. Experimental results on CommonsenseQA and OpenBookQA demonstrate the effectiveness of our method.
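To make the pruning step concrete, here is a minimal sketch, assuming spaCy for dependency parsing. The helper names, the toy grounding dictionary, and the exact scoring rule are illustrative assumptions, not the paper's implementation: each grounded concept is scored by how syntactically close its matched span sits to the other matched spans, and only the top-ranked concepts are kept.

```python
# Illustrative sketch only: PipeNet's actual scoring rule may differ.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

def path_to_root(token):
    """Token indices from `token` up to the dependency root (inclusive)."""
    path = [token.i]
    while token.head.i != token.i:  # spaCy roots are their own head
        token = token.head
        path.append(token.i)
    return path

def dep_distance(tok_a, tok_b):
    """Shortest-path length between two tokens in the dependency tree,
    computed via their lowest common ancestor."""
    pa, pb = path_to_root(tok_a), path_to_root(tok_b)
    common = set(pa) & set(pb)  # ancestors shared by both tokens
    return min(pa.index(i) + pb.index(i) for i in common)

def score_concepts(concept_spans):
    """Score each grounded concept by the negative mean dependency
    distance from its span to every other matched span: concepts that
    sit syntactically close to the rest of the query score higher."""
    scores = {}
    for name, span in concept_spans.items():
        others = [s for n, s in concept_spans.items() if n != name]
        mean_d = sum(dep_distance(span[0], o[0]) for o in others) / max(len(others), 1)
        scores[name] = -mean_d
    return scores

def prune(scores, keep_ratio=0.5):
    """Keep the top-ranked concepts; drop the rest as likely noise."""
    k = max(1, int(len(scores) * keep_ratio))
    return set(sorted(scores, key=scores.get, reverse=True)[:k])

question = "What do people typically do while playing guitar?"
doc = nlp(question)
# Toy grounding: map each (hypothetical) KG concept to its matched token span.
concept_spans = {t.text: doc[t.i : t.i + 1]
                 for t in doc if t.text in {"people", "playing", "guitar"}}
print(prune(score_concepts(concept_spans)))
```

The reasoning module can then be a standard GAT over the pruned subgraph. Below is one way to sketch it with PyTorch Geometric; the layer sizes, pooling choice, and scoring head are assumptions rather than the paper's exact architecture.

```python
# Hypothetical GAT reasoner over a pruned subgraph (PyTorch Geometric).
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv, global_mean_pool

class SubgraphReasoner(torch.nn.Module):
    def __init__(self, in_dim, hid_dim, heads=4):
        super().__init__()
        self.gat1 = GATConv(in_dim, hid_dim, heads=heads)
        self.gat2 = GATConv(hid_dim * heads, hid_dim, heads=1)
        self.score = torch.nn.Linear(hid_dim, 1)  # plausibility per candidate

    def forward(self, x, edge_index, batch):
        # x: features of the *pruned* subgraph's nodes; edge_index: its edges;
        # batch: maps each node to its candidate-answer subgraph.
        h = F.elu(self.gat1(x, edge_index))
        h = F.elu(self.gat2(h, edge_index))
        g = global_mean_pool(h, batch)    # one vector per candidate subgraph
        return self.score(g).squeeze(-1)  # higher = more plausible answer
```

In practice, the candidate whose subgraph scores highest would be predicted as the answer; because pruning shrinks `x` and `edge_index` before this module runs, that is where the computation and memory savings come from.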
